[Date Prev][Date Next][Thread Prev][Thread Next][Date Index][Thread Index]

Re: frequency vs core saturation



Original poster: <davep@xxxxxxxx>


>      Here is a question for all you mathy theoretical technical
> guys.  I heard in physics class tonight, that if you use a higher
> frequency you can use a smaller core for a transformer.  In my mind
> this means, if you use a higher frequency, you can pump more voltage
> through your primary, and consequently get more out of your
> secondary.  This means (provided you have enough insulation) you
> could use a much higher voltage in your TC primary, with the same
> transformer, by changing the input voltage and frequency, without
> saturating the transformer core.  So my question is, is this
> beneficial in any way?  Is this even true, or is my logic flawed?  I
> (as of right now) have no means with which to experiment with
> frequency and input voltage, but maybe somebody else out there
> does.  Thanks a heap.
    Short answer:
     It's more complicated than that.

    Longer answer:  A transformer designed for 60 Hz (ish)
    operation will run poorly and inefficiently, and will overheat,
    if driven well above its design frequency: the core (hysteresis
    and eddy-current) losses climb steeply with frequency.  For
    higher-frequency use (e.g., the 400 Hz-ish a.c. standard) the
    transformers (motors, etc.) are specially designed, notably with
    special core materials.  Modern developments in core materials
    have made such cores less expensive, but you would still need to
    reinsulate the secondary for higher voltage.
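    The kernel of truth in the question is the standard transformer
    EMF relation, V_rms = 4.44 * f * N * A * B_max: for a fixed
    voltage, turns count, and peak flux density, the required core
    cross-section scales as 1/f.  A minimal sketch (the winding and
    material numbers below are illustrative assumptions, not from
    any particular transformer):

```python
def required_core_area(v_rms, f, turns, b_max):
    """Core cross-section (m^2) needed so flux stays below b_max.

    From the transformer EMF equation: V_rms = 4.44 * f * N * A * B_max.
    """
    return v_rms / (4.44 * f * turns * b_max)

# Assumed example: 120 V winding, 500 turns, silicon-steel B_max ~ 1.2 T
a_60 = required_core_area(120.0, 60.0, 500, 1.2)    # 60 Hz mains
a_400 = required_core_area(120.0, 400.0, 500, 1.2)  # 400 Hz supply

print(f"60 Hz core area:  {a_60 * 1e4:.2f} cm^2")
print(f"400 Hz core area: {a_400 * 1e4:.2f} cm^2")
print(f"ratio: {a_60 / a_400:.1f}x")  # area shrinks by f ratio, 400/60 = 6.7x
```

    This is the design-time saving the physics class was pointing
    at; it says nothing about running an *existing* 60 Hz core at
    high frequency, where the loss problems above take over.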

    Tesla (and others of his day) walked this road: he started by
    running alternators fast, built special ones to go faster, then
    leapfrogged to the 'Tesla Coil' to get higher frequencies yet.

    (Books could be/have been writ on this; I'll stop here...)

     best
      dwp