Archive for the ‘electronics’ Category

Using MythTV to display HD recordings on an HDTV set

Thursday, January 3rd, 2008

I’ve been working with digital TV for almost four years now. I first started using it to feed my TiVo a cleaner signal: downconverting the high-definition broadcast to standard definition killed the analog ghosts. A few months later, I built my first MythTV system to record the full high-definition signal. Even though I have been recording high-definition broadcasts for three years, I only began displaying the recordings in high definition this week.

One of my holiday presents to myself was a Sony KDS-50A3000 HDTV. I love it: it’s vibrant, has wide viewing angles, and offers a ton of inputs. I anticipated being able to push a DVI signal into one of the three available HDMI inputs, so after ordering a DVI-to-HDMI cable from Blue Jeans Cable, it was time to get everything going.

Displaying HDTV is remarkably easy with recent nVidia drivers under Linux. The recent (8xxx) builds have predefined modes for HD: "1920x1080_60i" is 1080i, "1920x1080_60" is 1080p, and so on. They go in the Display subsection of the Screen section of xorg.conf.

SubSection "Display"
    Depth 16
    # Standard modes for computer workstations
    #Modes "1024x768" "800x600" "640x480"
    #
    # New modes for HDTV -- 1080p, 1080i, 720p, 480p
    Modes "1920x1080_60" "1920x1080_60i" "1280x720_60" "720x480_60"
EndSubSection

All is well and good. After running the cable from the DVI port to the HDMI input and restarting the X server, the TV reports that I’m giving it a 720p signal. I have HD, but not at the TV’s native resolution. A check of the log shows me why:

(WW) NVIDIA(0): No valid modes for "1920x1080_60"; removing.
(WW) NVIDIA(0): No valid modes for "1920x1080_60i"; removing.
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0): "1280x720_60"
(II) NVIDIA(0): "720x480_60"

Something is wrong with the 1080p and 1080i modes. The MythTV system is still using the GeForce FX5200 card, which is perfectly competent to drive any SDTV display, but many posts indicate it has problems pushing 1080i out the DVI port.

So, it’s time to get down and dirty. Running Xorg -logverbose 10 -probeonly will cause the server to dump everything. It’s interesting reading.
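
If you don’t want to wade through the whole dump, a couple of lines of shell will pull out the interesting parts. This is only a sketch; it assumes you probe on a spare display number (:1), so the log lands in /var/log/Xorg.1.log, the usual default location:

# Probe the hardware verbosely without actually starting a server
Xorg :1 -logverbose 10 -probeonly

# Pull out only the mode-validation chatter
grep -E "Validating Mode|Mode is rejected|Validated modes" /var/log/Xorg.1.log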

First of all, I note that the card is detected correctly and that, although the log attributes the limit to the display device, the FX5200’s DVI output has a maximum pixel clock of 135 MHz. (As an aside, 135 MHz is enough to drive 1280×1024 at 75 Hz.)

(--) NVIDIA(0): Connected display device(s) on GeForce FX 5200 at PCI:1:0:0:
(--) NVIDIA(0): Sony TV XV (DFP-0)
(--) NVIDIA(0): Sony TV XV (DFP-0): 135.0 MHz maximum pixel clock

A bit further down, the server reads the TV’s Extended Display Identification Data (EDID). From the block it reports, I learn that the set was made not long before I bought it, and that it really, really wants to be fed a signal with a 60 Hz refresh rate.

(--) NVIDIA(0): --- EDID for Sony TV XV (DFP-0) ---
(--) NVIDIA(0): EDID Version : 1.3
(--) NVIDIA(0): Manufacturer : SNY
(--) NVIDIA(0): Monitor Name : Sony TV XV
(--) NVIDIA(0): Product ID : 33536
(--) NVIDIA(0): 32-bit Serial Number : xxxxxxxx
(--) NVIDIA(0): Serial Number String :
(--) NVIDIA(0): Manufacture Date : 2007, week 36
(--) NVIDIA(0): DPMS Capabilities :
(--) NVIDIA(0): Prefer first detailed timing : Yes
(--) NVIDIA(0): Supports GTF : No
(--) NVIDIA(0): Maximum Image Size : 1600mm x 900mm
(--) NVIDIA(0): Valid HSync Range : 15 kHz - 70 kHz
(--) NVIDIA(0): Valid VRefresh Range : 58 Hz - 62 Hz
(--) NVIDIA(0): EDID maximum pixel clock : 150.0 MHz
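
Incidentally, you don’t need to go through the X server at all to look at the EDID block. Assuming the read-edid package is available on your distribution (its tools are get-edid and parse-edid; this is not something the X log shows you), you can pull the same data straight off the DDC channel:

# Dump the raw EDID over DDC and decode it (usually needs root)
get-edid | parse-edid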

Continuing down the log, I see why the 1080p mode was rejected. 1080p requires feeding the TV pixels at almost 150 MHz, faster than the card’s DVI output can clock them:

(II) NVIDIA(0): Validating Mode "1920x1080":
(II) NVIDIA(0): 1920 x 1080 @ 60 Hz
(II) NVIDIA(0): Mode Source: EDID
(II) NVIDIA(0): Pixel Clock : 148.50 MHz
(II) NVIDIA(0): HRes, HSyncStart : 1920, 2008
(II) NVIDIA(0): HSyncEnd, HTotal : 2052, 2200
(II) NVIDIA(0): VRes, VSyncStart : 1080, 1084
(II) NVIDIA(0): VSyncEnd, VTotal : 1089, 1125
(II) NVIDIA(0): H/V Polarity : +/+
(WW) NVIDIA(0): Mode is rejected: PixelClock (148.5 MHz) too high for
(WW) NVIDIA(0): Display Device (Max: 135.0 MHz).
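
The rejection is easy to verify by hand: the pixel clock is just the full raster size, blanking included, multiplied by the refresh rate. Using the HTotal and VTotal figures from the log:

pixel clock = HTotal x VTotal x refresh
            = 2200 x 1125 x 60 Hz
            = 148,500,000 Hz = 148.5 MHz   (above the card's 135.0 MHz limit)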

The 1080i mode, however, is rejected for a different reason: the driver reports the interlaced mode’s vertical refresh as 120 Hz, twice the 60 Hz field rate, and that falls outside the narrow 58-62 Hz range the TV advertises:

(II) NVIDIA(0): Validating Mode "1920x1080":
(II) NVIDIA(0): 1920 x 540 @ 60 Hz
(II) NVIDIA(0): Mode Source: EDID
(II) NVIDIA(0): Pixel Clock : 74.18 MHz
(II) NVIDIA(0): HRes, HSyncStart : 1920, 2008
(II) NVIDIA(0): HSyncEnd, HTotal : 2052, 2200
(II) NVIDIA(0): VRes, VSyncStart : 540, 542
(II) NVIDIA(0): VSyncEnd, VTotal : 547, 562
(II) NVIDIA(0): H/V Polarity : +/+
(II) NVIDIA(0): Extra : Interlaced
(WW) NVIDIA(0): Mode is rejected: VertRefresh (120.0 Hz) out of range
(WW) NVIDIA(0): (58.000-62.000 Hz).
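
The numbers in the log are at least consistent with that reading, if you assume (my interpretation, not anything the log states outright) that the driver computes the refresh from the per-field VTotal and then doubles it for interlaced modes:

field rate  = 74,180,000 Hz / (2200 x 562) = 60 Hz
VertRefresh = 2 x 60 Hz = 120 Hz   (outside the 58-62 Hz EDID range)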

By adding 120 Hz as an acceptable refresh rate and overriding the ranges the display reports, I could push a 1080i signal to the TV. However, 720p recordings looked awful when MythTV scaled them up to 1080i. There is some loss of picture quality when 1080i material is displayed at 720p, but it is much less noticeable. Therefore, I will stick with 720p until I can figure out how to switch video modes on the fly between 1080i and 720p.
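
For the record, the override boils down to giving the driver explicit frequency ranges and telling it not to take them from the EDID. Here is a minimal sketch of the relevant xorg.conf pieces; the identifier names are mine, and the exact option spelling should be checked against the README for whatever nVidia driver version you run:

Section "Monitor"
    Identifier   "SonyTV"
    # Ranges widened so the 1080i mode's computed 120 Hz refresh passes validation
    HorizSync    15.0 - 70.0
    VertRefresh  58.0 - 120.0
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "nvidia0"
    Monitor      "SonyTV"
    DefaultDepth 16
    # Trust the HorizSync/VertRefresh lines above instead of the EDID ranges
    Option       "UseEdidFreqs" "FALSE"
    SubSection "Display"
        Depth 16
        Modes "1920x1080_60i" "1280x720_60" "720x480_60"
    EndSubSection
EndSection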

eneloop charge lifetime anecdote #1

Thursday, September 13th, 2007

A couple of weeks ago, I recommended Sanyo’s eneloop low-self-discharge nickel-metal hydride rechargeables. The first set I bought has finally discharged completely. I purchased a package of four AA batteries to power my Canon Speedlite. eneloops ship fully charged and can be used immediately; when mine arrived on July 19, I put them into the flash and started blasting away. That set was finally exhausted on August 30. (I use the flash every week, but I am not a professional photographer who needs multiple fully charged sets of batteries per day.)

I was so impressed with the long-term charge retention that I decided to convert my Logitech Harmony remote over to eneloops as well. I’ve been feeding it alkaline AAA batteries for three years, in large part because a remote control is a low-draw application where ordinary rechargeables tend to self-discharge before the remote ever uses up the charge. My first set of AAAs for the Harmony was delivered on August 17, and I’ll post an update when those need to be recharged.