The difference between a converter and a transformer lies in how each device reduces the voltage. Alternating current (AC) power is delivered as a smoothly oscillating waveform called a "sine wave".
To reduce 230 V to 120 V, for example, a converter delays the start of each half-cycle of the sine wave so that the average voltage (more precisely, the root-mean-square, or RMS, voltage) over a full wave is lowered. The high voltage peaks, unfortunately, are still present, and this is what destroys electronic equipment: a device's power supply typically rectifies the input to near its peak value rather than its RMS value. Resistive appliances such as light bulbs and heaters don't care about those peaks, and many motors tolerate them as well.
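The effect can be seen numerically. The sketch below assumes a simple phase-cut model of a converter: each half-cycle of a 230 V RMS sine is blanked out until a "firing angle" is reached, and a bisection search finds the angle that brings the RMS down to roughly 120 V. The specific numbers and the search procedure are illustrative, not a description of any particular product.

```python
import math

VRMS_IN = 230.0                           # input RMS voltage (example)
VPEAK_IN = VRMS_IN * math.sqrt(2)         # ~325 V peak
N = 20_000                                # samples over one full cycle

def chopped_wave(alpha):
    """Sine wave with the first `alpha` radians of each half-cycle
    blanked out, as a phase-cutting converter does."""
    out = []
    for i in range(N):
        theta = 2 * math.pi * i / N
        v = VPEAK_IN * math.sin(theta)
        if (theta % math.pi) < alpha:     # converter blocks the start of each half-wave
            v = 0.0
        out.append(v)
    return out

def rms(samples):
    """Root-mean-square of a list of samples."""
    return math.sqrt(sum(v * v for v in samples) / len(samples))

# Bisect for the firing angle that yields ~120 V RMS.
# RMS falls monotonically as the angle grows, so bisection converges.
lo, hi = 0.0, math.pi
for _ in range(40):
    mid = (lo + hi) / 2
    if rms(chopped_wave(mid)) > 120.0:
        lo = mid
    else:
        hi = mid

wave = chopped_wave(lo)
print(f"firing angle ~ {math.degrees(lo):.0f} deg")
print(f"output RMS   ~ {rms(wave):.0f} V")     # close to 120 V
print(f"output peak  ~ {max(wave):.0f} V")     # ~300 V, far above a true 120 V sine's ~170 V peak
```

The RMS comes out near 120 V, yet the waveform still swings to roughly 300 V, which is why a rectifying power supply behind such a converter sees close to the full input voltage.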
A transformer, on the other hand, reduces the amplitude of the entire wave, peaks included. This is the critical difference: electronic devices cannot cope with the high voltage peaks that a converter leaves in place.
Delaying the start of each half-wave is a relatively simple function that fits in a small, light circuit. Scaling the wave's amplitude, by contrast, relies on electromagnetic induction through an iron core and copper windings, which takes space and mass. As a result, transformers are generally larger, heavier and much more expensive than converters.