I have read several mods on here and don't know which is best. There seem to be two fundamentally different approaches to powering the wire feed motor: either taking the power from the big weld transformer, or providing a separate dedicated supply.
SIP seem to favour taking it from the weld transformer. The reason given seems to be that as you change range, and so change the output voltage, the motor automatically speeds up or slows down to match. That sounds like a good thing, but is it?
Another possible reason for using the weld transformer is that it avoids having to fit a separate transformer, which would add to the cost of building the welder.
If power comes from the weld transformer then the voltage is higher when there is no arc, so the wire will feed faster with no arc than once you strike one. That does not sound so good. I don't know whether the voltage also varies during welding, depending on movements of the torch, how close the wire is to the weld and so on. If it does change, then a bigger arc current will pull the voltage down and less wire will be fed. Is this good or bad?
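To put rough numbers on that first effect, here is a little Python sketch using the standard permanent-magnet DC motor model, where steady-state speed is roughly (V - I*R)/k. Every constant in it is my guess for a small gearmotor, not a SIP figure:

```python
# Rough numbers for the "feeds faster with no arc" effect, using the
# standard permanent-magnet DC motor model: speed ~ (V - I*R) / k.
# Every constant here is a guess for a small gearmotor, not a SIP spec.

K_E = 0.02        # V per rpm, assumed back-EMF constant
R_ARMATURE = 2.0  # ohms, assumed armature resistance
I_LOAD = 0.5      # A, assumed motor current while pulling wire

def feed_rpm(supply_volts):
    """Steady-state rpm of a simple PM DC motor model."""
    return (supply_volts - I_LOAD * R_ARMATURE) / K_E

V_OPEN_CIRCUIT = 24.0  # no arc: off-load transformer voltage (guess)
V_ARCING = 18.0        # arc struck: loaded voltage (guess)

print(f"no arc: {feed_rpm(V_OPEN_CIRCUIT):.0f} rpm")
print(f"arcing: {feed_rpm(V_ARCING):.0f} rpm")
```

With those guesses the wire runs roughly a third faster before the arc strikes than after, which is a big difference if you are trying to get a clean start.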
If the current drops because the torch is too far away, then maybe the wire needs to feed faster to get closer in. Alternatively, if the current drops then perhaps less metal needs feeding into the weld, because the weld is going slower. Can the motor change speed fast enough to track an operator waving the torch nearer to or further from the metal?
Does anyone know whether the current really does change like this? It might be causing some of the difficulty people have getting the welder running just right. I would think an experienced welder would suffer less from it, because he would hold the torch steadier and move it evenly. Is it better to design a motor drive that responds to changing arc current, or one that deliberately does not?
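On the "can it track the operator" question, here is a rough estimate. If you treat the motor and gearbox as a simple first-order lag (a time constant of about 0.2 s is my guess for a small gearmotor), then a torch wobble at frequency f gets attenuated by the usual first-order factor:

```python
# How much torch wobble can the feed actually follow? Treating motor +
# gearbox as a first-order lag with time constant TAU (0.2 s is a guess
# for a small gearmotor), a wobble at frequency f is attenuated by the
# usual factor 1 / sqrt(1 + (2*pi*f*TAU)^2).
import math

TAU = 0.2  # s, assumed mechanical time constant

for f in (0.5, 2.0, 5.0):  # wobble frequencies in Hz
    atten = 1.0 / math.sqrt(1.0 + (2 * math.pi * f * TAU) ** 2)
    print(f"{f:3.1f} Hz wobble -> motor follows about {atten * 100:.0f}% of it")
```

On that assumption the feed would track slow deliberate movements fairly well but mostly ignore faster hand shake, which would just come through as ragged feed.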
A separate supply makes it easier to smooth the power and presumably gives a steadier feed. Some people here have suggested using off-the-shelf PWM controllers to set the motor speed. Some commercial welder designs use a thyristor to similar effect, chopping rectified but unsmoothed AC (I am not sure which winding it comes from), and those circuits also provide some feedback, giving the motor more current if it runs slow. The generic controllers probably will not do this, but they may deliver a more regular power feed if they chop at a much higher frequency than mains.
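A rough way to see why faster chopping should be smoother: the motor averages the chopped supply over its mechanical time constant, so the speed ripple scales roughly with the ratio of chopping period to time constant. Again, all the numbers here are assumptions:

```python
# Why faster chopping should give a steadier feed: the motor averages
# the chopped supply over its mechanical time constant, so speed ripple
# scales roughly with chopping period / time constant. TAU is a guess.

TAU = 0.2  # s, assumed motor + gearbox mechanical time constant

controllers = {
    "thyristor at mains rate (100 Hz pulses)": 1.0 / 100,
    "generic PWM module (5 kHz, assumed)": 1.0 / 5000,
}
for name, period in controllers.items():
    print(f"{name}: period is {period / TAU * 100:.3f}% of the time constant")
```

So on those guesses a kHz-range PWM module would give ripple fifty times smaller than mains-rate chopping, before you even add any smoothing.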
The thyristor designs in welders turn motor power off briefly as the mains voltage passes through zero. The motors have inertia, but they might still slow a little every mains cycle. That might be a good thing, because not much metal melting is going on while the current is off. Or it might be a bad thing, because it would not hurt for the wire to get a bit closer while the current is off, so the arc restrikes more easily.
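Here is a crude simulation of how much a motor with some inertia would actually slow during those zero-crossing gaps, again modelling it as a first-order lag driven by full-wave rectified 50 Hz, with my guessed time constant:

```python
# Crude check on the per-mains-cycle slowdown: model the motor as a
# first-order lag driven by full-wave rectified 50 Hz, which falls to
# zero 100 times a second. All constants are guesses for illustration.
import math

TAU = 0.2       # s, assumed mechanical time constant
DT = 0.0001     # s, simulation step
MAINS_HZ = 50.0

speed = 0.0
speeds = []
for step in range(int(2.0 / DT)):                    # run 2 s to settle
    t = step * DT
    v = abs(math.sin(2 * math.pi * MAINS_HZ * t))    # rectified mains, 0..1
    speed += (v - speed) * DT / TAU                  # first-order lag
    if t > 1.9:                                      # record the last 0.1 s
        speeds.append(speed)

mean = sum(speeds) / len(speeds)
print(f"speed ripple per mains cycle: {(max(speeds) - min(speeds)) / mean * 100:.2f}%")
```

On those assumptions the per-cycle dip comes out around one percent, so the inertia may already be hiding most of the effect.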
Some people here have said things improve if you simply put capacitors across the motor supply. That might be related to the last point, evening out the spot where the motor would otherwise slow, or it may just be that any fix to the original SIP design makes things better.
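For anyone trying the capacitor mod, the usual back-of-envelope is that a reservoir capacitor across a full-wave rectified supply sags by about I/(2*f*C) between peaks. With a guessed half-amp of motor current:

```python
# Back-of-envelope for the capacitor mod: across a full-wave rectified
# supply, a reservoir capacitor sags by roughly dV = I / (2 * f * C)
# between peaks. Motor current is a guess; capacitor values are just
# round numbers to try.

I_MOTOR = 0.5   # A, assumed average motor current
F_MAINS = 50.0  # Hz

for c_uf in (470, 2200, 10000):
    sag = I_MOTOR / (2 * F_MAINS * c_uf * 1e-6)
    print(f"{c_uf:>6} uF -> about {sag:.1f} V sag between peaks")
```

So on those numbers it takes a few thousand microfarads before the sag gets down to a volt or two.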
Does anyone here have experience with more expensive machines, to know how they power their motor speed controllers when money is not such an issue but they want performance?