High-bandwidth Digital Content Protection, or HDCP, is a security measure designed to block unauthorized copying of Blu-ray and other HD content. To play protected movies on a computer connected to a high-definition monitor or TV, both the video card and the display must be HDCP compliant. Practically all HDTVs and monitors are HDCP-ready today, as are video cards with an HDMI or DVI connector.
Background
The arrival of HDCP in 2004 was a dark day for video pirates, who encountered a significant new barrier. It also brought headaches for consumers who unknowingly connected devices that did not share HDCP compatibility. Intel Corp. developed the technology, which is licensed by Digital Content Protection LLC. Beyond video cards and Blu-ray players, HDCP is common on newer monitors, TVs, cable boxes and satellite receivers.
Video Cards
HDCP primarily targets high-definition devices with DVI and HDMI video ports. Most modern video cards have either a DVI or an HDMI output for connecting to a monitor. To play HDCP content on an external monitor, a computer needs both an HDCP-ready video card and an HDCP-ready monitor. If you are playing HDCP content on a laptop using its built-in screen, you don't need an HDCP-ready video card.
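If you're unsure whether your setup supports HDCP, some Linux graphics drivers expose the status directly. The Python sketch below assumes a Linux desktop with the `xrandr` utility installed and a driver that publishes the DRM "Content Protection" connector property (Intel's i915 driver does); on other platforms or drivers the property simply won't appear, so treat this as one possible check, not a universal one.

```python
# Sketch: list connected outputs and any "Content Protection" (HDCP)
# property the driver exposes, by parsing `xrandr --prop` output.
# Assumes Linux/X11 with xrandr and a DRM driver that publishes the
# "Content Protection" property (e.g. Intel i915).
import re
import subprocess

def hdcp_capable_outputs():
    """Return {output_name: hdcp_property_value_or_None}."""
    text = subprocess.run(
        ["xrandr", "--prop"], capture_output=True, text=True, check=True
    ).stdout
    results = {}
    current = None
    for line in text.splitlines():
        # Output headers look like: "HDMI-1 connected 1920x1080+0+0 ..."
        header = re.match(r"^(\S+) (connected|disconnected)", line)
        if header:
            current = header.group(1) if header.group(2) == "connected" else None
            if current:
                results[current] = None
            continue
        # Property lines are indented beneath their output header.
        if current and "Content Protection:" in line:
            results[current] = line.split(":", 1)[1].strip()
    return results

if __name__ == "__main__":
    for output, hdcp in hdcp_capable_outputs().items():
        print(f"{output}: {hdcp if hdcp else 'no HDCP property exposed'}")
```

On a supported setup the property reads "Undesired", "Desired" or "Enabled"; an output with no property line at all gives you no HDCP information either way.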
HDCP One Decade Later
Ten years after its 2004 introduction, HDCP remains prevalent in the computer industry. Few, if any, high-definition video cards lack HDCP support in 2014; if you're not certain, check the card's specifications before purchasing. HDCP does not affect low-resolution or analog connections, so if you can connect a monitor to a computer using a VGA port, for example, you can still play high-definition movies. Of course, this method reduces the video's quality.
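Because HDCP governs only digital links, it can help to know which of a machine's outputs are digital in the first place. Here is a minimal Python sketch along those lines, assuming a Linux system where the kernel's DRM subsystem names connectors under /sys/class/drm (e.g. card0-HDMI-A-1, card0-VGA-1); the prefix lists here are illustrative, not exhaustive.

```python
# Sketch: label each connected output as digital (HDCP can apply) or
# analog (outside HDCP's scope), using Linux DRM connector names.
# Assumes kernel sysfs layout like /sys/class/drm/card0-HDMI-A-1/status.
from pathlib import Path

DIGITAL_PREFIXES = ("HDMI", "DVI", "DP", "eDP")   # links HDCP can protect
ANALOG_PREFIXES = ("VGA", "TV", "Composite", "SVIDEO")

for conn in sorted(Path("/sys/class/drm").glob("card*-*")):
    if (conn / "status").read_text().strip() != "connected":
        continue
    # Connector type is the token after "cardN-", e.g. "HDMI-A-1" -> "HDMI".
    kind = conn.name.split("-", 1)[1].split("-")[0]
    if kind in DIGITAL_PREFIXES:
        print(f"{conn.name}: digital link (HDCP may be enforced)")
    elif kind in ANALOG_PREFIXES:
        print(f"{conn.name}: analog link (outside HDCP's scope)")
    else:
        print(f"{conn.name}: unrecognized connector type {kind!r}")
```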