Wild Thumper --Voltage drop with no current

So this is a new one... and sorta weird.

When I run my motors (or really, just the left channel) off my Wild Thumper controller, I am getting a major voltage suck but no increase in current. I have tried the standard motor swap (potentially bad with known good), replaced the wires going to the motor(s), and tried both the SLA batt (fully charged) and my desktop power supply. I have also watched the voltage level via the power supply itself, a meter, and the Wild Thumper itself. All readings agree --we go from 12.5V to around 6.5 or 7V at the start-up of the left motor. At the same time, also checked with different meters, I am only drawing 3A from that motor (actually, both motors are drawing the same --about 3A or so with no load). The problem exists in both Fwd and Rev.

I simply don't have the electrical engineering knowledge to know what is going on here. I need someone to go all Ohm's-law on this problem.

BTW --The Wild Thumper, in the end, is a dual H-bridge made up of big-honkin' FETs in pretty much the "standard" configuration.

Am I missing something simple here? If it is as simple as just a blown transistor, I would like to isolate the problem a bit before I whip out the desolder braid. And I gotta add... Drill motors are really friggin' hard on motor drivers.

Doing a search brought some questions to mind.

Voltage drops are due to resistance. Can you check the FETs individually to see if the voltage across one or more of them drops?

Ohm's law: E = IR, so R = E/I. If the ‘good’ side shows 12V @ 3A then R = (12.5 - 12)/3 ≈ 0.17 ohms. The other side is (12.5 - 6.5)/3 = 2 ohms.
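To spell that arithmetic out, here's a quick Python sketch of the same calculation --the voltages and current are just the figures quoted above, and source_resistance is only a name I'm using for it:

```python
def source_resistance(v_open, v_loaded, current_a):
    """Series resistance implied by a voltage sag under load: R = (V_open - V_loaded) / I."""
    return (v_open - v_loaded) / current_a

# Figures quoted above: 12.5V supply, ~3A draw per motor
good_side = source_resistance(12.5, 12.0, 3.0)   # about 0.17 ohm
bad_side = source_resistance(12.5, 6.5, 3.0)     # 2 ohms

print(f"good side: {good_side:.2f} ohm, sagging side: {bad_side:.2f} ohm")
```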

I am only guessing that the FETs are run in parallel, so a bad FET (or any other paralleled component on the left side) that has gone high-resistance could really throw a wrench in the works.

Another suggestion I noticed was to look for the warm component --where there's more heat, there's more resistance.

It's not the controller…

Well, I checked each FET and every one is within the range of its neighbors in terms of resistance. I moved on to simply swapping out part after part. I used a total of 5 different test motors, all drills, ranging from a 12V homeowner's special (cheap) to an 18V pro DeWalt model.

  • After more testing --L and R from the controller are equal --they do the same thing when swapped
  • One motor seems to be very happy (a Ryobi 14.4V cheapie) --no voltage drop and a 1.7A draw
  • Most if not all of the DeWalt motors drew in the 3A range but sucked the voltage down to 6 or 7V
  • One Craftsman drill motor (small, cheap, and very similar to the Ryobi, which works) also sucked the voltage down
  • Taking a reading of the resistance of the windings of each motor shows they are all about the same (around 1.5 ohms)
  • There seems to be no rhyme or reason to which motor does the voltage suck-down thing
  • In the past, a “bad” motor was one that started sucking amps but did not drag the supply voltage down like this.


Damn, you nailed it, OddBot

How did you do that all the way from China? Was it a Jedi Mind Trick, OB wan?

It was the SLA and a bad cell. Basically, one small motor flew under the radar while any of the DeWalts was enough to put it over the top. Of course, hindsight is 20/20 in terms of my power supply as well… I have a pretty nice one with control of current and volts, and I usually have the current set to max and then adjust my volts to what I need --when powering data circuits, current is never an issue. Now, the power supply has a max output of about 4 amps. Badda boom, badda bing: when I started drawing more than that, the power supply lowered its own voltage output automatically.
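For what it's worth, the numbers line up. Treat the pack as an ideal cell in series with an internal resistance --the ~2 ohms worked out earlier in the thread. A failing cell isn't really a nice fixed resistor, and the healthy-SLA figure below is an assumed ballpark rather than anything I measured, but a rough sketch shows the shape of the problem:

```python
def terminal_voltage(v_open, r_internal_ohms, current_a):
    """Terminal voltage of a source modelled as an ideal cell plus a series internal resistance."""
    return v_open - current_a * r_internal_ohms

# 0.03 ohm is an assumed ballpark for a healthy 12V SLA; 2 ohms is the value implied by the sag above
print(terminal_voltage(12.5, 0.03, 3.0))  # ~12.4V: a healthy pack barely notices a 3A load
print(terminal_voltage(12.5, 2.0, 3.0))   # 6.5V: matches the sag seen with the DeWalt motors
print(terminal_voltage(12.5, 2.0, 1.7))   # 9.1V: the lighter-drawing Ryobi sags far less, so it flew under the radar
```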

What is the real bottom line here?

My battery was dead and I burned about 10 hours trying to figure it out.  Rolls eyes

Thanks guys.