Archive for January, 2008

Fixing small font displays on MythTV HD displays

Sunday, January 6th, 2008

I recently upgraded to an HDTV set, and I am finally displaying the HD recordings on my MythTV system in HD glory. Like most of my tasks with MythTV, though, it wasn’t just a straightforward connection to the HD set and away we go.

MythTV assumes that the display device has 100 dots per inch of resolution, even though TVs do not. The Sony KDS-50A3000 is a 50″ set with a resolution of 1920×1080, or about 44 dots per inch. (Interestingly, this isn’t all that different from the 40-45 dots per inch of the small standard definition TV I had been using with MythTV.)

The nVidia graphics display drivers calculate dots per inch based on the EDID information. In the case of the KDS-50A3000, it calculated a value of 30 dpi. The resulting font was so tiny I could barely read it with my nose pressed up against the screen. The MythTV wiki describes using the DisplaySize directive to get to the magic 100 dpi. Once I put in the appropriate directive and restarted X, the fonts in the Myth front end were nicely readable.
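For reference, DisplaySize takes the physical screen dimensions in millimeters, so you pick values that make the pixel counts work out to 100 dpi rather than using the real measurements. A minimal sketch for a 1920x1080 mode (the "SonyTV" identifier is just a placeholder for whatever your Monitor section is called):

Section "Monitor"
    Identifier  "SonyTV"
    # 1920 px / 100 dpi x 25.4 mm/in = 488 mm; 1080 px / 100 dpi x 25.4 mm/in = 274 mm
    DisplaySize 488 274
EndSection

With those numbers in place, xdpyinfo should report a resolution of roughly 100x100 dots per inch.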

Context is everything in statistics

Sunday, January 6th, 2008

Catching up on my blog reading, I found a post where Richard Florida writes about immigration in the heartland. He relates a story from when he served on a panel in 2003 for the governor of Iowa on the future of the state’s economy, where a conference attendee stated:

I’m the son of Mexican immigrants, both low-skilled. I’m also a recent graduate of Grinnell College [one of the most respected small liberal arts colleges in the country]. Of my graduating class, only five of us have decided to stay in the state of Iowa.

I’m a not-so-recent graduate of Grinnell. Many of my peers pursued graduate school, and many left Iowa for the workforce. A large number of graduates stayed in Iowa, where the career office’s connections were strongest. When I graduated, the career office pushed me to stay in Iowa, and didn’t seem to want to help me leave the state. I find it highly improbable that, several years later, only five of roughly 300 to 400 graduates would stay in the state, so I wish there were a bit more context given for the number five.

Using MythTV to display HD recordings on an HDTV set

Thursday, January 3rd, 2008

I’ve been working with digital TV for almost four years now. I first started using it to kill analog ghosts by downconverting a high-definition signal to standard definition for my TiVo. A few months later, I built my first MythTV system to record the full high-definition signal. Even though I have been recording high-definition TV broadcasts for three years, I have only begun displaying the recordings in high definition this week.

One of my holiday presents to myself was a Sony KDS-50A3000 HDTV. I love it. It has a vibrant picture, wide viewing angles, and a ton of inputs. I anticipated being able to push a DVI signal to one of the three available HDMI inputs. After ordering a DVI-to-HDMI cable from Blue Jeans Cable, it was time to get everything going.

Displaying HDTV is remarkably easy with recent nVidia drivers under Linux. In the recent (8xxx) builds, there are predefined modes for HD: "1920x1080_60i" is 1080i, "1920x1080_60" is 1080p, and so on. They go in the Display subsection of the Screen section in xorg.conf.

SubSection "Display"
    Depth 16
    # Standard modes for computer workstations
    #Modes "1024x768" "800x600" "640x480"
    #
    # New modes for HDTV - 1080p, 1080i, 720p, 480p
    Modes "1920x1080_60" "1920x1080_60i" "1280x720_60" "720x480_60"
EndSubSection

All is well and good. After running the cable from the DVI port to the HDMI input and restarting the X server, the TV reports that I’m giving it a 720p signal. I have HD, but not at the TV’s native resolution. A check of the log shows me why:

(WW) NVIDIA(0): No valid modes for "1920x1080_60"; removing.
(WW) NVIDIA(0): No valid modes for "1920x1080_60i"; removing.
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0): "1280x720_60"
(II) NVIDIA(0): "720x480_60"

Something is wrong with the 1080p and 1080i modes. The MythTV system is still using the GeForce FX5200 card, which is perfectly competent to drive any SDTV display, but many posts indicate it has problems pushing 1080i out the DVI port.

So, it’s time to get down and dirty. Running Xorg -logverbose 10 -probeonly will cause the server to dump everything. It’s interesting reading.
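If you want to reproduce this, something along these lines should do it; the :1 display number is just my way of avoiding a collision with an already-running server (the log then lands in /var/log/Xorg.1.log), and the grep is only one way to skim the output:

# Probe the hardware at maximum verbosity without starting a session
Xorg :1 -logverbose 10 -probeonly

# Skim the driver's mode-validation chatter
grep 'NVIDIA(0)' /var/log/Xorg.1.log | less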

First of all, I note that the card is detected correctly and that, although the log attributes the limit to the display device, the FX5200 has a maximum pixel clock of 135 MHz. (As an aside, 135 MHz is sufficient to do 1280x1024 at 75 Hz.)

(--) NVIDIA(0): Connected display device(s) on GeForce FX 5200 at PCI:1:0:0:
(--) NVIDIA(0): Sony TV XV (DFP-0)
(--) NVIDIA(0): Sony TV XV (DFP-0): 135.0 MHz maximum pixel clock

A bit further down, the server probes the TV for its Extended Display Identification Data (EDID). From the section it reports, I learn that the set wasn’t made long before I bought it, and that it really, really wants to be fed a signal with a 60 Hz refresh rate.


(--) NVIDIA(0): --- EDID for Sony TV XV (DFP-0) ---
(--) NVIDIA(0): EDID Version : 1.3
(--) NVIDIA(0): Manufacturer : SNY
(--) NVIDIA(0): Monitor Name : Sony TV XV
(--) NVIDIA(0): Product ID : 33536
(--) NVIDIA(0): 32-bit Serial Number : xxxxxxxx
(--) NVIDIA(0): Serial Number String :
(--) NVIDIA(0): Manufacture Date : 2007, week 36
(--) NVIDIA(0): DPMS Capabilities :
(--) NVIDIA(0): Prefer first detailed timing : Yes
(--) NVIDIA(0): Supports GTF : No
(--) NVIDIA(0): Maximum Image Size : 1600mm x 900mm
(--) NVIDIA(0): Valid HSync Range : 15 kHz - 70 kHz
(--) NVIDIA(0): Valid VRefresh Range : 58 Hz - 62 Hz
(--) NVIDIA(0): EDID maximum pixel clock : 150.0 MHz

Continuing down the log, I see why the 1080p mode was rejected. 1080p requires feeding the TV pixels at almost 150 MHz, which exceeds the 135 MHz maximum pixel clock noted above:

(II) NVIDIA(0): Validating Mode "1920x1080":
(II) NVIDIA(0): 1920 x 1080 @ 60 Hz
(II) NVIDIA(0): Mode Source: EDID
(II) NVIDIA(0): Pixel Clock : 148.50 MHz
(II) NVIDIA(0): HRes, HSyncStart : 1920, 2008
(II) NVIDIA(0): HSyncEnd, HTotal : 2052, 2200
(II) NVIDIA(0): VRes, VSyncStart : 1080, 1084
(II) NVIDIA(0): VSyncEnd, VTotal : 1089, 1125
(II) NVIDIA(0): H/V Polarity : +/+
(WW) NVIDIA(0): Mode is rejected: PixelClock (148.5 MHz) too high for
(WW) NVIDIA(0): Display Device (Max: 135.0 MHz).
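For reference, that 148.50 MHz figure follows directly from the timings the driver lists:

pixel clock = HTotal x VTotal x refresh rate
            = 2200 x 1125 x 60 Hz
            = 148.5 MHz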

However, the 1080i mode is rejected for a different reason: the driver calculates a vertical refresh of 120 Hz for the interlaced mode, which falls outside the 58-62 Hz range the TV reports in its EDID:

(II) NVIDIA(0): Validating Mode "1920x1080":
(II) NVIDIA(0): 1920 x 540 @ 60 Hz
(II) NVIDIA(0): Mode Source: EDID
(II) NVIDIA(0): Pixel Clock : 74.18 MHz
(II) NVIDIA(0): HRes, HSyncStart : 1920, 2008
(II) NVIDIA(0): HSyncEnd, HTotal : 2052, 2200
(II) NVIDIA(0): VRes, VSyncStart : 540, 542
(II) NVIDIA(0): VSyncEnd, VTotal : 547, 562
(II) NVIDIA(0): H/V Polarity : +/+
(II) NVIDIA(0): Extra : Interlaced
(WW) NVIDIA(0): Mode is rejected: VertRefresh (120.0 Hz) out of range
(WW) NVIDIA(0): (58.000-62.000 Hz).
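As best I can tell, the 120 Hz the driver objects to is simply the 60 Hz field rate implied by those timings, doubled because the mode is interlaced:

74.18 MHz / (2200 x 562) = 60 fields per second
2 x 60 fields per second = 120 Hz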

By adding 120 Hz as an allowed refresh rate and overriding the information from the display, I could push a 1080i signal to the TV. However, I found that 720p recordings looked awful when MythTV converted them to 1080i for display. There is some loss of picture quality when displaying a 1080i recording as 720p, but it is much less noticeable. Therefore, I will stick with 720p until I can figure out how to switch video modes on the fly between 1080i and 720p.
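For anyone who wants to try the 1080i route, the override boils down to giving the server explicit sync ranges and telling the nVidia driver not to trust the EDID-reported limits. A rough sketch of what goes in the Monitor section of xorg.conf; the identifier and the exact ranges here are illustrative, not values from Sony:

Section "Monitor"
    Identifier  "SonyTV"
    # Wide enough to admit the interlaced mode's 120 Hz figure
    HorizSync   15.0 - 70.0
    VertRefresh 58.0 - 122.0
    # Tell the nVidia driver to use the ranges above instead of the
    # frequency limits reported in the TV's EDID
    Option      "UseEDIDFreqs" "false"
EndSection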