Yes, any cable has an electrical resistance proportional to its length. The greater the resistance and the greater the current drawn, the greater the voltage drop across the cable: you could connect 230V to one end of the cable, and when you start welding at an expected current of 150A, the input voltage at the welder drops to, say, 210V, so you can't draw all the current you expect. (These figures are made up to illustrate the point; I don't know what the real figures would be.) The effect would be much less when welding at lower currents.
If you have to run an extension that far, try a well-overspecified cable, e.g. a 32A extension instead of a 13A one. I don't know offhand where you would get one, but I believe "Arctic" cable is available in higher current ratings. A cable with a higher current rating has a thicker conductor and therefore a lower resistance.
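To put rough numbers on this, here is a small sketch of the drop calculation (V = I × R). The figures are my own illustrative assumptions, not from the question: copper conductors, a 16A input current, and typical cross-sections of 2.5mm² for a 13A lead versus 6mm² for a heavier one. Current flows out through the live and back through the neutral, so the conductor length is twice the lead length.

```python
# Rough voltage-drop estimate for a long two-conductor extension lead.
# All specific figures here are illustrative assumptions, not measured values.

RHO_COPPER = 1.72e-8  # resistivity of copper, ohm * metre

def voltage_drop(current_a, lead_length_m, cross_section_mm2):
    """Voltage drop across a two-conductor lead carrying current_a amps."""
    conductor_length_m = 2 * lead_length_m        # out via live, back via neutral
    area_m2 = cross_section_mm2 * 1e-6            # mm^2 -> m^2
    resistance = RHO_COPPER * conductor_length_m / area_m2
    return current_a * resistance

# An assumed 16 A input current over a 50 m lead:
print(round(voltage_drop(16, 50, 2.5), 1))  # 2.5 mm2 ("13 A") lead -> about 11 V lost
print(round(voltage_drop(16, 50, 6.0), 1))  # 6 mm2 heavier lead   -> under 5 V lost
```

Even with made-up numbers, the comparison shows why the thicker cable matters: more than doubling the cross-section cuts the drop by the same factor.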
50m is a significant run, so if you will be welding near the maximum of the welder's capacity, you will probably need to tackle this problem.