Roughly within an order of magnitude or so, either way.
The wavelength used in microwave ovens is about 12 centimeters (rounded). In the early days of radar, some systems used wavelengths longer than that, but there haven't been any for several decades now.
Inches (microwave) vs. Feet (Radar).
The wavelength used in household microwave ovens is typically about 12.2 centimeters, which corresponds to a frequency of 2.45 GHz. That's the standard band for cooking food.
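As a quick sanity check of the numbers above, the wavelength follows directly from λ = c/f. A minimal sketch, using the 2.45 GHz figure from the answer above:

```python
# Speed of light in vacuum (m/s)
c = 2.998e8

# Typical household microwave oven magnetron frequency (Hz)
f = 2.45e9

# Wavelength = speed of light / frequency
wavelength_cm = (c / f) * 100
print(f"{wavelength_cm:.1f} cm")  # -> 12.2 cm
```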
Microwaves are electromagnetic waves with wavelengths shorter than radio waves but longer than infrared. The wavelength range is 1 meter down to 1 mm. One particular frequency is strongly absorbed by water (though it is not a sharp resonance), and that is what microwave ovens are tuned to produce: 2.45 GHz, which is about a 12.2 cm wavelength.
A photon of this wavelength has an energy of about 10^-5 eV.
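That figure can be checked from E = hf (Planck's constant times frequency); a minimal sketch:

```python
h = 6.626e-34   # Planck's constant (J*s)
f = 2.45e9      # microwave oven frequency (Hz)
eV = 1.602e-19  # joules per electronvolt

# Photon energy E = h * f, converted from joules to eV
energy_eV = h * f / eV
print(f"{energy_eV:.2e} eV")  # -> roughly 1e-5 eV
```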
AM radio, FM radio, and television use radio waves that are longer than the ones used for RADAR; cellphones, GPS, and microwave ovens operate closer to radar bands, though most radar still uses shorter wavelengths.
Microwave ovens cook with several hundred watts of radio waves at or near 2.45 GHz (2,450 MHz), which corresponds to a wavelength of about 12 cm.
Microwave ovens operate at 2.45 GHz. http://www.howstuffworks.com/microwave.htm
Scientists group electromagnetic waves into categories according to their wavelength. Technically, a radio wave has a longer wavelength than a microwave, but that's the only fundamental difference. They are both electromagnetic waves.