What is the key rule for ensuring that a different laptop charger can work with a laptop?

I've been wary ever since, a few years ago, I used a charger from a different brand of laptop without checking its electrical specifications, and it ruined my battery, essentially turning the laptop into an AC-only computer.

My current laptop is a Toshiba, as is my smaller one, and the chargers for each have the same voltage and amperage. I thus assume that they would be safely interchangeable.

However, do voltage and amperage have to be the same to be safe? I suspect that a higher voltage would wreck the laptop or battery, and likewise a higher amperage. I don't have a great knowledge of power, however.

These are my guesses; could you please verify each and elaborate? I'd like to learn why something is good or bad. Do both the input and output specs have to match the criteria? Or is only the output important?

  1. Higher voltage & higher amperage: bad
  2. Higher voltage & same amperage: bad
  3. Higher voltage & low amperage: bad
  4. Same voltage & higher amperage: bad
  5. Same voltage & same amperage: good
  6. Same voltage & lower amperage: good, but slower charge
  7. Lower voltage & higher amperage: bad
  8. Lower voltage & same amperage: good, but less power and slower charge
  9. Lower voltage & lower amperage: good, but less power and slower charge

Thank you

(PS: I suppose the same would apply to phones, etc?)


Basics (for chargers):

  • Higher voltage & higher amperage: bad / possible damage to circuits

  • Higher voltage & same amperage: bad / possible damage to circuits

  • Higher voltage & lower amperage: bad / possible damage to circuits

  • Same voltage & higher amperage: good

  • Same voltage & same amperage: good

  • Same voltage & lower amperage: risky (can overheat the charger)

  • Lower voltage & higher amperage: bad (for the cells)

  • Lower voltage & same amperage: bad (for the cells)

  • Lower voltage & lower amperage: bad (for the cells)
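The rules above can be sketched as a simple compatibility check. This is a minimal illustration; the function name, the 0.5V tolerance, and the sample numbers are my own assumptions, not from any standard:

```python
def is_safe_replacement(charger_volts, charger_amps, laptop_volts, laptop_amps,
                        tolerance=0.5):
    """Return True when a charger is a safe substitute for the original.

    Voltage must match within a small tolerance (e.g. 19V vs. 19.5V is fine).
    The charger's current rating must be at least what the laptop draws;
    a lower rating is treated as unsafe here, matching the "risky" row above.
    """
    voltage_ok = abs(charger_volts - laptop_volts) <= tolerance
    current_ok = charger_amps >= laptop_amps
    return voltage_ok and current_ok
```

For example, `is_safe_replacement(19.5, 4.7, 19.0, 3.4)` passes (same voltage, more current headroom), while a 15V charger on a 19V laptop fails on voltage.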

Practically, you need the same voltage for a correct charge. But it's not quite that simple: some chargers state the voltage slightly differently, which doesn't matter in practice. A 19V charger will work in place of a 19.5V one and vice versa.

(Example: an 18650 cell, which is typical for notebooks, has a full-charge voltage of 4.2V. Four cells in series sum to 16.8V, so practically any charger rated between 18.5V and 19.5V should be fine for that.)
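The cell arithmetic in that example can be checked directly. A quick sketch (the 4.2V per-cell figure comes from the example above; the function name is my own):

```python
CELL_FULL_CHARGE_V = 4.2  # typical full-charge voltage of one 18650 Li-ion cell

def pack_full_charge_voltage(cells_in_series):
    """Full-charge voltage of a pack with the given number of cells in series."""
    return cells_in_series * CELL_FULL_CHARGE_V

print(pack_full_charge_voltage(4))  # prints 16.8
```

A 19V charger leaves a couple of volts of headroom above that 16.8V, which the charging circuit uses.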

Now for the logic of good and bad when the voltage is compatible: with a higher-current charger there is no problem, because the cells and circuits draw only the power they need, no more. The practical effect is that the charger runs cooler (it never works at full capacity). With a lower-current charger, the risk is that the cells will demand more current than it can provide, and it may overheat or even burn out.
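The "the load draws only what it needs" point can be illustrated with a quick power calculation. The wattages here are made-up but typical figures, not from the question:

```python
def charger_load_fraction(laptop_watts, charger_volts, charger_amps):
    """Fraction of a charger's rated output power a laptop actually uses."""
    charger_watts = charger_volts * charger_amps
    return laptop_watts / charger_watts

# A 65 W laptop on a 19 V / 4.74 A (~90 W) charger uses about 72% of its
# rating, so the charger runs cooler than an exactly matched 65 W unit would.
print(round(charger_load_fraction(65, 19.0, 4.74), 2))  # prints 0.72
```

Run the same numbers against an underrated charger (say 19V / 2A, i.e. 38W) and the fraction exceeds 1, which is exactly the overheating scenario described above.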

With a lower charger voltage, the cells will at best charge very little, but this situation is rarely encountered, since there are big voltage steps between chargers for different numbers of cells in series: ~11V for 2 in series, ~14-15V for 3, and ~18.5-19.5V for 4. Using a 15V charger on a 4-series pack can in theory charge the cells a little, but it is totally insufficient.

  • Perfect! Thank you :) But I am curious - why is low voltage bad for the cells? Is it because they have to draw power for a much longer time? – Dog Lover Apr 12 '17 at 6:00
  • 1
    They will not charge sufficiently, and that results in a big charge imbalance between pairs. Cells today unfortunately charge in series (in most cases). In the long term, assuming you have 3 pairs, one will always charge to max, one will be nearly charged, and one undercharged (which eventually causes it to fail, and then the battery becomes unusable). An incomplete charge only accelerates this process, and the worst pair will fail much faster. – Overmind Apr 12 '17 at 6:03
  • Oh, also - is the input voltage and amperage relevant? If so, if it fulfills one of the "bad" criteria, is the charger bad to use? i.e. Does both input and output have to fulfill a "good" criterion? – Dog Lover Apr 12 '17 at 6:08
  • 1
    You need to worry about the output of the power supply and ensure that (a) the voltage matches the input required by the computer and (b) the current the PSU can provide is at least as much as the computer draws – but it can be more. The question about input and output fulfilling criteria doesn't really make a lot of sense in this respect (but yes, both input and output have to fulfill the good criteria). – davidgo Apr 12 '17 at 6:22
  • 1
    @DogLover – pretty much (except there is no transformer; there are electrical bits which do the same job much more efficiently). You do, of course, need to ensure the power supply matches the wall voltage, but that should not be a problem for a modern device – they typically accept both 100-120 and 200-240 volts, which covers everything. – davidgo Apr 12 '17 at 9:57
