I recently upgraded to an HDTV set, and I am finally displaying the HD recordings on my MythTV system in HD glory. Like most of my tasks with MythTV, though, it wasn’t just a straightforward connection to the HD set and away we go.
MythTV assumes that the display device has a resolution of 100 dots per inch, even though TVs fall well short of that. The Sony KDS-50A3000 is a 50″ set with a resolution of 1920×1080, or about 44 dots per inch. (Interestingly, this isn’t all that different from the 40-45 dots per inch of the small standard-definition TV I had been using with MythTV.)
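The 44 dpi figure follows directly from the set’s diagonal size and pixel count. As a quick sketch (the 50″ diagonal is the nominal spec, so this is an approximation):

```python
import math

# Approximate DPI of a 16:9 panel from its nominal diagonal and resolution.
# Values here are for the Sony KDS-50A3000: 50" diagonal, 1920x1080 pixels.
diagonal_in = 50
width_px, height_px = 1920, 1080

# Length of the diagonal in pixels, then pixels per inch along it.
diagonal_px = math.hypot(width_px, height_px)
dpi = diagonal_px / diagonal_in

print(round(dpi))  # roughly 44
```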
The nVidia graphics display drivers calculate dots per inch based on the EDID information. In the case of the KDS-50A3000, they calculated a value of 30 dpi. The resulting font was so tiny I could barely read it with my nose pressed up against the screen. The MythTV wiki describes using the DisplaySize directive in xorg.conf to get to the magic 100 dpi. Once I put in the appropriate directive and restarted X, the fonts in the Myth front end were nicely readable.
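To arrive at the DisplaySize values, divide the resolution by the target 100 dpi and convert to millimeters (1 inch = 25.4 mm): 1920 / 100 × 25.4 ≈ 488 mm wide, 1080 / 100 × 25.4 ≈ 274 mm tall. A sketch of the relevant xorg.conf fragment — the Identifier name here is a placeholder, so match it to your own Monitor section:

```
Section "Monitor"
    Identifier  "HDTV"       # placeholder; use the name from your config
    # 1920 px / 100 dpi = 19.2" = 487.7 mm
    # 1080 px / 100 dpi = 10.8" = 274.3 mm
    DisplaySize 488 274      # width and height in millimeters
EndSection
```

DisplaySize lies to X about the physical panel dimensions, which is exactly the point: the driver then computes 1920 / (488 / 25.4) ≈ 100 dpi and scales fonts accordingly.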