Yes and no.
Domestic HD video uses the HDMI interface to carry signals. HDMI not only carries video but also handles audio and control functions.
DVI is a similar interface and is entirely compatible with HDMI video. An image carried by DVI will be identical to one carried by HDMI. Unlike HDMI, however, DVI does not support audio and is therefore not generally accepted as an HD video interface.
You will need a DVI cable. Note that DVI can carry a digital signal, an analog signal or both. Look in the manuals to see which the computer and monitor support. It is normally digital, which might be referred to as DVI-D. If analog is supported, it will be called DVI-A. A DVI connection that carries both is called DVI-I. If you find you have a choice, use the digital signal rather than the analog.
HDMI and DVI are both interfaces with specific cables, and both support HD video. Component and composite refer to methods of encoding information to represent a moving image; they are not types of cable. Equipment will often have connectors for composite or component inputs and outputs, but the type of connector and cable used can vary from one item to another. However, to answer the test question as posed: HDMI and DVI carry digital video data, including HD video. Component specifies a signal format that can be digital or analog and also supports HD video. Composite is a format for a single encoded signal that supports standard definition only. That comparison is summarized in the sketch below.
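For quick reference, here is that comparison restated as a small Python lookup table. It only encodes the points made above; the field names are my own.

```python
# Quick-reference summary of the formats discussed above.
# "signal" and "hd" restate the answer text; composite is in
# practice a single encoded analog signal, SD only.
VIDEO_FORMATS = {
    "HDMI":      {"signal": "digital",                      "hd": True},
    "DVI":       {"signal": "digital or analog",            "hd": True},
    "component": {"signal": "digital or analog",            "hd": True},
    "composite": {"signal": "single encoded analog signal", "hd": False},
}

for name, caps in VIDEO_FORMATS.items():
    print(f"{name}: {caps['signal']}, HD supported: {caps['hd']}")
```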
Yes, it will work the same as HDMI; DVI just will not carry audio. DVI is capable of 1080p.
VCRs do not support HDTV, so you will not be able to record HD. You will be able to record a converted HD signal, but you will lose the quality.
There are three forms of DVI: DVI-A, DVI-D and DVI-I. The "A" stands for analog, the "D" for digital, and the "I" (integrated) is a combination of the two. The interface sends an uncompressed video signal to a receiver such as a monitor, but the signal itself can be digital, analog or both.
Yes and yes. "DVI-D with HDCP": here is my understanding of it all. First, DVI-D means digital signal only. HDCP stands for High-bandwidth Digital Content Protection, meaning you can't record the signal. As for the other part of the question, "Is it able to be viewed like a TV, and do you need to subscribe to HD with your cable provider?": yes, but you have to have a cable box with a DVI output connector, or one of the newer DVD players with DVI output. You didn't say what the name of your cable service is, so I don't know whether you have to subscribe (that is, pay) for HDTV or not. I'm on Cox cable and get HDTV for free.
The advantages of DVI are that it is a digital signal and it is very widely used.
HDMI is the newer type of video output or input connector on the market, while DVI is its predecessor. Both can display beautiful HD pictures on an HD-compatible screen.
HDMI is the standard interface for HD video today. It is only a cable connection, and it carries whatever signal is given to it to carry. The signal can be standard-definition video, high-definition video or DVI computer display data. What it won't do is make a signal into high definition; it will only carry signals as they are generated. However, if it's connected from a PC to a television, the PC can interrogate the television, find out the native resolution of the television and then generate a suitable signal. That guarantees the best possible quality of image. Not all PCs will do this, so you may still need to set the output resolution manually. Also, be aware that DVI is compatible with HDMI in that they handle video signals in the same way. Unlike HDMI, DVI does not support audio. Therefore, a DVI-to-HDMI cable will connect PCs with DVI outputs to televisions that only have HDMI ports.
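To make that "interrogate the television" step concrete: over HDMI or DVI, the PC reads the display's EDID block, which includes the native (preferred) resolution. Here is a minimal sketch that parses the preferred timing out of a raw EDID dump. The /sys path is Linux-specific, and the connector name "card0-HDMI-A-1" is an assumption; list the directory to find yours.

```python
# Minimal sketch: read a display's native resolution from its EDID block.
# Linux exposes the raw EDID under /sys/class/drm/; the connector name
# below ("card0-HDMI-A-1") is an assumption -- adjust for your machine.

EDID_PATH = "/sys/class/drm/card0-HDMI-A-1/edid"

def preferred_resolution(edid: bytes) -> tuple[int, int]:
    """Parse the first detailed timing descriptor (the preferred mode).

    In the base EDID block, descriptors start at byte 54 and are 18 bytes
    each; active pixel counts are split across a full byte and a high nibble.
    """
    dtd = edid[54:72]
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active

if __name__ == "__main__":
    with open(EDID_PATH, "rb") as f:
        edid = f.read()
    w, h = preferred_resolution(edid)
    print(f"Native resolution reported by the display: {w}x{h}")
```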
VGA can certainly support HD standards. VGA is a hardware standard for display interfaces and supports numerous resolutions, including 720p, 1080i and 1080p; in fact, VGA ports often support even higher resolutions. The practical use of VGA for HD display has some limitations due to commercial restrictions.

Commercial HD content delivered via disc, cable or satellite is exclusively delivered to a display using HDMI. This is a digital interface and supports HDCP, a protection system that encrypts the signal to prevent copying of HD material. Commercial HD players and receivers do not output HD signals through a VGA port because HDCP cannot be implemented on VGA ports.

Non-commercial HD content and some downloaded content can make good use of VGA displays, but generally, HD material will demand HDMI rather than VGA. It is worth noting that DVI outputs on computers are compatible with HDMI but often do not support HDCP; therefore, the use of DVI is limited in a similar way to VGA.
This monitor does support DVI input.
Rule of thumb: if your output is digital, try to keep it digital all the way. If your output is analog, same deal. The more times you switch over, the worse the end result will be.

If your computer supports a digital (DVI) output, and your monitor does too, then keeping the signal all digital helps maintain the integrity of the feed. Every time you switch between a digital signal (DVI) and an analog (VGA) signal, you're "interpreting" between the two. Interpreting always involves a degree of error. That might mean frame stuttering in motion video, or horizontal sync problems in still frames.

The best approach is to avoid interpreting as much as possible. If your computer can support DVI out, and your display can support DVI in, then connect them directly with a DVI cable. If one can support DVI and the other is VGA, a single switchover is best.

In essence, the fewer translations the better. I frequently see DVI computers connected to DVI displays through two VGA conversions. (The cable connections translate DVI to VGA, and then back again.) That's basically "interpreting" a true-to-life signal two times, and each time comes with loss of data. The fewer of those conversions, the more true-to-life your output will be.
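To illustrate the point about repeated conversions, here is a toy simulation. It is an illustration only, not a model of real video hardware: each digital-to-analog-to-digital round trip adds a little noise and re-quantizes the samples, and the accumulated error grows with every extra conversion.

```python
# Toy illustration of the "every conversion loses something" rule.
# A video line is modeled as 8-bit samples; each analog round trip
# adds small noise and re-quantizes, so error accumulates per trip.
import random

def analog_round_trip(samples, noise=1.5):
    """Simulate digital -> analog -> digital: add noise, then re-quantize."""
    return [max(0, min(255, round(s + random.gauss(0, noise))))
            for s in samples]

if __name__ == "__main__":
    random.seed(0)
    original = [random.randrange(256) for _ in range(1000)]
    signal = original
    for trip in range(1, 5):
        signal = analog_round_trip(signal)
        mean_err = sum(abs(a - b) for a, b in zip(original, signal)) / len(original)
        print(f"after {trip} round trip(s): mean error = {mean_err:.2f} levels")
```

Running it shows the mean error climbing with each round trip, which is the whole argument for a single DVI-to-DVI connection over a chain of DVI-to-VGA adapters.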