Talk:1080i

From Wikipedia, the free encyclopedia

29.97fps vs 30fps?

Would someone care to add WHY the 0.03 fps difference exists: how it came about, and what the reason is for its existence? Thanks! —Preceding unsigned comment added by 71.206.65.120 (talk) 04:03, 17 March 2009 (UTC)

It came about with the move to NTSC colour. It solved certain problems that would have existed had the frame rate remained at 30fps. 20.133.0.13 (talk) 13:31, 27 October 2009 (UTC)

I believe the reason is to avoid a beat with 60 Hz electricity. 29.97 certainly existed before NTSC color. Adamgoldberg (talk) 12:49, 15 September 2021 (UTC)

No it didn't. It was the introduction of colour that necessitated the slight change. Had the field rate remained at 60Hz, it would have given a visible dot crawl pattern on the screen. Reducing the field rate by 0.1% to 59.94Hz solved the problem. By this time the accelerating voltages on the CRT were high enough that the slightly different mains frequency did not give a visible interference pattern from stray magnetic fields. 86.162.147.159 (talk) 18:20, 5 October 2022 (UTC)

1080i v. p

I am removing the line about 1080p offering no advantage and being generally unsupported. In addition to its now being more widely supported by the latest generation of LCD sets, the statement about it offering no advantage is unsupported by references and certainly arguably untrue (refer to the article on interlacing). For displays such as DLP, plasma and LCD, interlacing must be removed, and this causes visible artifacts in the process.

Robbins 06:20, 28 November 2006 (UTC)

Is there any reason why anyone would prefer 1080i over 1080p, except for those with a CRT-based HDTV set? In uncompressed form the data rate, i.e. pixels per second, is the same, and it is easier and more efficient to compress video in progressive format than in interlaced format.

As has been added to the article, many current LCD TVs do not support the 1080p standard. It offers no advantage. Interlaced scanned formats have always offered a resolution advantage over progressive scan (plus a reduction in flicker on CRTs). This is primarily why the EMI 405i system triumphed over the Baird 240p system in 1936. Had LCD displays been around in 1936, the outcome may not have been so clear-cut, though the greater portability of the EMI all-electronic cameras offered a significant advantage over the Baird film-based cameras, which required a water supply for the film-processing plant in the camera base (which meant that they were bolted to the studio floor).
The previous poster states that interlaced scanning offers a higher resolution than progressive scanning. For any given line standard this is simply not true. For example, 1080i/25 has a lower vertical resolution than 1080p/25. In fact, the apparent vertical resolution of 1080i is approximated by that of 720p. Comparing a 405-line interlaced electronic system with a 240-line progressive mechanical system is hardly fair! 82.127.93.74 17:19, 19 January 2007 (UTC)
Yes, when using the same bandwidth 1080p has 25 frames per second at a resolution of 1080 lines, while 1080i has 50 fields per second at a resolution of 540 lines. The latter is useful in sports, where the second field is not just the odd-numbered lines, but also the action happening 1/50 second later. This gives you 50 live updates per second as opposed to 25, and is the reason why sports look much better on interlaced CRT screens than on most de-interlaced flat screens. Of course, if you compare double-bandwidth 1080p60, then it wins over 1080i every time. Carewolf 13:16, 2 August 2007 (UTC)
1080i is not necessarily better for sports. As you get half the horizontal lines updated in each update, if something is moving across the screen you see it 'stretched', as half of it is 1/60th of a second 'ahead' of the rest. I include a link for a potential future update to this poorly written, misinformed wiki here - Anomalous result (talk) 09:09, 12 December 2007 (UTC)

The main article states: "Because of interlacing 1080i has half the vertical resolution of 1080p." This is not true. 1080i and 1080p have exactly the same spatial resolution, but 1080i has less temporal resolution. RastaKins 05:06, 21 March 2007 (UTC)

As far as I'm aware, the main reason 1080i still exists is broadcast engineers' reluctance to change. There is no technical reason interlacing is needed any more (TVs are quite capable of displaying progressive scan now), but it's still around (much as the obscure 29.97 frames/second frame rate still exists) - the benefits are questionable (claiming interlacing "doubles the frame rate" is silly, given that there's basically no perceivable difference between 25 and 50fps - if there were, cinema wouldn't still be 24fps). 81.152.116.183 (talk) 02:37, 18 July 2008 (UTC)

There is a big perceivable difference between 24/25fps and 50fps. 24fps has a distinctive 'film' look, while 50fps has a much smoother look that is associated with video (for obvious reasons). However, since almost all film and TV is shot at 24 or 25fps, you would gain nothing from 1080p over 1080i for these, as they'd both be showing the same frames at the same resolution. Teppic74 (talk) 13:05, 24 September 2011 (UTC)

It seems few are capable of understanding temporal resolution, which operates in the human visual cortex: interlaced video, if presented accurately, provides more total resolution than progressive for the same frame rate (two fields per interlaced frame). The fact that monitor and TV manufacturers wish to abandon it makes no difference, and in fact becomes a self-fulfilling prophecy: since the devices alter the signal, the picture is now slightly degraded in addition to the loss of temporal resolution. — Preceding unsigned comment added by Mydogtrouble (talk • contribs) 13:58, 13 March 2013 (UTC)

1440px wide?

Maybe I'm completely wrong, but I'm sure I've read that 1080i is (often, if not always) 1440x1080 before being stretched to 16:9 (like DVD), at least when being broadcast on terrestrial television. Nova Prime 11:51, 17 January 2007 (UTC)

Most HD cameras only output 1440 pixels per line in 1080 mode and several broadcast HD VTR formats only record 1440 samples per line in 1080 mode, in order to reduce costs. The signals are re-sampled and interpolated back to 1920 per line on the outputs of these devices, however. 82.127.93.74 17:19, 19 January 2007 (UTC)
ATSC supports only 1920x1080 frames (1920x540 fields) for 1080i MPEG-2 transmission (although the FCC didn't accept that recommendation, so in theory any MPEG-2 MP@HL resolution is supported for use within the US). ATSC was recently revised to support MPEG-4 Part 10 (AVC/H.264) for video, and 1440x1080 is a supported resolution for that standard, although the FCC hasn't accepted that recommendation either (which in this case means that TV stations can't broadcast a primary signal in H.264).
So, the bottom line is, a TV station can theoretically broadcast 1440x1080i60 if it's in the US, but there's a strong chance many TVs simply will not display the signal. I'd be inclined to believe TV stations would be more likely to broadcast a 1440x1080p(45-50) (using RFF) stream if they're going to ignore the ATSC recommendation and just do what's possible, as that would allow them to broadcast 1080p at a high frame rate while still keeping the stream under 20Mbps.
And then again, the major reason to broadcast a high frame rate is when showing sports. If you really want to make your TV station unpopular, broadcast sports using a resolution that might not be supported by everyone's TVs... —Preceding unsigned comment added by 66.149.58.8 (talk) 11:52, 12 July 2009 (UTC)

Deprecated terminology

The terminology used within this article is deprecated. The preferred terminology, as described by the ITU and SMPTE, has the following format: xxxxy/zz, where xxxx is the number of active lines per picture (usually 1080 or 720 when discussing high definition), y is the scanning mode (the letter i for interlaced scanning or the letter p for progressive scanning), next comes a slash character, and zz is the refresh rate of the picture. Thus standard-definition television as used in Europe would be described as 576i/25. 82.127.93.74 17:40, 19 January 2007 (UTC)
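The nomenclature described above is mechanical enough to express in a few lines of code. A minimal sketch in Python, assuming the xxxxy/zz reading given in the comment; the helper name itu_notation is made up for illustration and comes from no standard library:

```python
def itu_notation(active_lines, scanning, refresh_rate):
    """Format a video mode in the ITU/SMPTE style described above:
    <active lines><scanning mode>/<refresh rate>, e.g. 576i/25.
    (Illustrative helper; not part of any real API.)"""
    if scanning not in ("i", "p"):
        raise ValueError("scanning mode must be 'i' (interlaced) or 'p' (progressive)")
    return f"{active_lines}{scanning}/{refresh_rate}"

print(itu_notation(576, "i", 25))   # European SD -> 576i/25
print(itu_notation(1080, "i", 25))  # -> 1080i/25
print(itu_notation(720, "p", 50))   # -> 720p/50
```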

"Solidus"?

The second paragraph says:

 Others, including the European Broadcasting Union (EBU), prefer to use the frame rate 
 instead of the field rate and separate it with a solidus from the resolution as in 1080i/30 
 and 1080i/25, likewise 480i/30 and 576i/25.

The word "solidus" links to the page for the mark "slash," which includes a very stuffy note that "slash" and "solidus" are not the same. Jackrepenning 22:05, 22 January 2007 (UTC)

Yeah, that's ridiculous. "Solidus" ??? As opposed to what, "liquidus"? Gimme a break. I replaced it with the word "slash".

What to reference

When writing a paper, you are not supposed to list references for things that someone reasonably skilled in the field would already know. For example, 1080i60 uses the same bandwidth as 1080p30. Most people who know something about video display know this is true. A simple calculation of the image size times the frame rate shows it: 60 * 1920 * 540 = 30 * 1920 * 1080. So does Wikipedia have a different standard for what should have a reference? Daniel.Cardenas 04:25, 4 April 2007 (UTC)
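The arithmetic in the comment above can be checked directly; a minimal sketch (the variable names are illustrative):

```python
# Uncompressed pixel rate = rate x width x active lines per field/frame.
i_rate = 60 * 1920 * 540    # 1080i60: 60 fields/s, each field 1920x540
p_rate = 30 * 1920 * 1080   # 1080p30: 30 frames/s, each frame 1920x1080

assert i_rate == p_rate
print(i_rate)  # 62208000 pixels per second in both cases
```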

Since a paper on video display is intended for a particular audience, its author can reasonably make assumptions about what the audience should know. Wikipedia, however, is aimed at a general audience, and we cannot assume that anyone reading the article is "reasonably skilled" in video display, or even that they know anything at all about video display. Many, perhaps most, people reading this article would not know how to calculate bandwidth. Fumblebruschi (talk) 21:00, 27 December 2007 (UTC)

Layman's question

I'm baffled. I thought 30 fps was the standard for US television, and that the difference between interlaced and progressive was whether the full frame was completed in one pass or in two passes. I'm no techie, so perhaps this could be explained in layman's terms. The discussion above compounds my confusion rather than ending it. Second question, partly related to the first: Here's a quote from the article: "Due to interlacing 1080i has twice the frame-rate but half the resolution of a 1080p signal using the same bandwidth. This is especially useful in sport-shows and other shows with fast-moving action." I'm not sure what "this," the first word in the second sentence, refers to. It seems to me to refer to the subject of the first sentence, 1080i, which means that 1080i is better for shows with fast-moving subjects. I'm almost sure that's wrong, because of the movement of displayed objects in the picture between the first and second scans of the same frame. But even if it's not wrong, perhaps the pronoun "This" that begins the second sentence should be changed to a noun (The 1080i standard or the 1080p standard) to clear up any confusion.

Many cameras don't use the same frame for generating the two fields in interlaced video. So imagine 50 frames per second, each divided into odd and even lines: you get the even lines of the first frame, then the odd lines of the second frame, then the even lines of the third frame, and so on. This is also what makes de-interlacing so incredibly hard, because you cannot just recombine the two fields to get the original. Carewolf 13:21, 2 August 2007 (UTC)
Answer to "baffled": Interlaced (1080i) transmits 30 frames per second, half of each frame every 1/60th of a second. Progressive (720p) transmits 60 complete frames per second. On flat-screen HDTV sets (LCD or plasma), all broadcasts are displayed as progressive; thus the two parts of the 1080i frame must be combined (de-interlaced), and some HDTV sets perform de-interlacing better than others. 1080p is not broadcast; it has twice the frame rate (60/second) of 1080i (30/sec). Many believe that 720p at 60 frames/sec is superior for fast-moving action (it's used by ESPN and ABC). -Dawn McGatney 69.139.231.9 (talk) 08:29, 30 March 2008 (UTC)
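The "combining the two parts of the frame" step described above is the simplest form of de-interlacing, usually called "weave". A toy sketch of that idea, assuming both fields were sampled from the same source frame (the function and variable names are made up for illustration):

```python
def weave(top_field, bottom_field):
    """Recombine two fields into one progressive frame ("weave" de-interlacing).
    Each field is a list of rows; the top field supplies lines 0, 2, 4, ...
    This only reconstructs the original frame cleanly when both fields come
    from the same moment in time, as with film-originated material."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)
        frame.append(bottom_row)
    return frame

# A toy 4-line "frame" split into fields and rebuilt:
top = ["line0", "line2"]
bottom = ["line1", "line3"]
print(weave(top, bottom))  # ['line0', 'line1', 'line2', 'line3']
```

When the two fields were captured 1/50th or 1/60th of a second apart, this naive weave is exactly what produces the combing artifacts discussed elsewhere on this page.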

Contradiction in frame rates.

The first paragraph speaks of frame rates of 25 and 30 Hz for 1080i, yet the comparison table says 1080i is 50 or 60 Hz and 1080p is 24, 25 or 30 Hz. This appears to be a contradiction. —Preceding unsigned comment added by KX36 (talkcontribs) 06:15, 16 August 2007

This is not a contradiction; it's just from the standards that the Consumer Electronics Association has adopted. 1080i content exists only in 50 and 60 fields-per-second varieties. 1080p exists in 24, 25 and 30 frames per second. There is no analogous format for 1080i at 48 fields per second. Traditionally, 1080p24 content will be telecined to 1080i60, or sped up to 1080p25 and interlaced, making 1080i50. — Preceding unsigned comment added by 206.248.184.215 (talk) 21:38, 22 September 2012 (UTC)

1080iN notation - Is N frame rate or field rate?

This is unclear in the article.

"1080p60" has no ambiguity (60 full frames per second), but "1080i60" might be interpreted as 60 two-field frames per second, or as 60 fields per second.

Can we have a section establishing the standard interpretation? We may also need to clarify or avoid all usages of "N frames per second": fields or full frames?

Glueball 10:56, 10 September 2007 (UTC)

Until now I had always seen that number meaning fields per second for interlaced modes, so that PAL-B/G is expressed as 576i50. But I'll have to check. --150.241.250.3 07:31, 18 September 2007 (UTC)

The number following "p" or "i" is frames (complete pictures) per second. 1080i is short for 1080i30; 720p is short for 720p60. (And 480i (SD) is short for 480i30.) -Dawn McGatney 69.139.231.9 (talk) 08:38, 30 March 2008 (UTC)
No, it's fields. You'd be hard-pressed to find a single instance (except possibly in the EBU notation - I'd like a citation, because a quick Google shows around 8x the number of references to 1080i/60 as to 1080i/30) where a 60Hz 1080 interlaced signal is referred to as "1080i30". On manufacturers' data sheets, it's 1080i60. In reviews, it's 1080i60. In "Dummy's guide to HDTV" type articles, it's 1080i60. I'm a little baffled that Wikipedia is bucking the trend here, as I've come across a number of Wikipedia articles (and only WP articles) using the "frames" number, and no citations to back it up.
I'll await clarification on the EBU notation before changing that, but for now, if there's no slash, the number after the "i" should be fields; otherwise Wikipedia's going to be out of step with pretty much the entire world on this! --66.149.58.8 (talk) 11:44, 11 July 2009 (UTC)
The definition of the EBU nomenclature is found in the article's first external link, "High Definition (HD) Image Formats for Television Production" (EBU - Tech 3299), under the heading "Nomenclatures and Image Sampling Systems". It says "samples horiz. x active lines/Scanning/frame rate" with an abbreviated style without the horizontal samples. Regarding 1080i/60 vs 1080i/30, a month later EBU Technical Review Editorial No. 301 says that "the convention used to describe TV formats is the number of active lines per frame + the scanning algorithm [interlace(i) or progressive (p)] / the frame rate" while using the notation 1080i/30 in the article itself. Lacking an authoritative source, I find it hard to believe that notations in the form of 1080i30 would be anything more than derivatives of these. I believe the de facto source of the EBU style notation is to be found in the ITU publications sourced in the EBU - Tech 3299 document under the heading "Informative References". 212.246.213.38 (talk) 18:51, 23 October 2009 (UTC)
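The two conventions argued over in this section differ only in whether the trailing number counts fields or frames (an interlaced frame being two fields). A small illustrative helper showing both renderings of the same mode; the function name is invented, and this is just one reading of the conventions discussed above, not an authoritative converter:

```python
def describe_interlaced(active_lines, field_rate):
    """Express an interlaced mode in both conventions discussed above.
    field_rate is in fields per second; a frame is two fields."""
    frame_rate = field_rate / 2
    by_fields = f"{active_lines}i{field_rate}"      # common usage, e.g. 1080i60
    by_frames = f"{active_lines}i/{frame_rate:g}"   # EBU slash style, e.g. 1080i/30
    return by_fields, by_frames

print(describe_interlaced(1080, 60))  # ('1080i60', '1080i/30')
print(describe_interlaced(576, 50))   # ('576i50', '576i/25')
```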

540p?

Why does 540p redirect here? It's clearly not the same. 83.108.208.28 (talk) 00:22, 9 July 2009 (UTC)

Same question, almost a year later. Why does nobody fix things around here? --77.109.214.213 (talk) 11:19, 10 May 2010 (UTC)
Because 1080i is made up of two interlaced 540-line fields. See Display resolution#Current standards. 74.179.40.22 (talk) 18:14, 18 May 2010 (UTC)
One field is 1920 x 540 pixels; 540p is 960 x 540 pixels. Added to article to dodge further confusion. CadetMadet (talk) 01:04, 4 July 2010 (UTC)
That information was deleted – along with various other material – about two months later (00:58, 28 August 2010 (UTC) by Mikus, with the edit explanation "Less bla-bla"). I just put it back (and added a mention of 1440x1080). I suppose that involves "More bla-bla", but I think it is important information. —Mulligatawny (talk) 17:38, 27 April 2011 (UTC)

Broken citation

Please fix my broken citation (#3, as of the time of writing). I could not find a way to link the image, which is on Wikimedia Commons, inside the reference. The image is the only real reliable reference, in this case (as explained in my edit comment). Comanoodle (talk) 21:02, 20 September 2009 (UTC)

1877x1000?

There is an edit by an anonymous editor in the fourth paragraph: "1877x1000 (the actual displayed resolution of a 1920x1080 source) resolutions.", in contrast to the original "1920x1080 resolutions". Where in the world did they get that information? To my knowledge, all HDTV resolutions are displayed as such; no cropping is ever done (which I believe the author is trying to say, in contrast to scaling). If nobody comments, I'll undo it. Elmo Allen (talk) 04:39, 23 November 2009 (UTC)

Some televisions apply an artificial "overscan", where the outer 8-10% of the image isn't seen, in the same way that older CRT displays tended to. However, this is quite uncommon with modern displays. —Preceding unsigned comment added by 121.98.240.135 (talk) 10:12, 27 April 2011 (UTC)

Interlaced artefacts

I think the article needs to make clearer that interlaced video doesn't have to suffer any artefacts at all, and can produce exactly the same results as 1080p broadcasts. The image used in the article is a bit misleading. For example, in the UK most HD channels broadcast programmes at 1080i, but the source material is 25fps, and so a 1080p television simply combines the two fields to reproduce the original progressive frame. You'd never under any circumstances see a combing effect, because the two fields are from the exact same original frame. As the article stands, it gives the impression that 1080i is always visually inferior to 1080p, which is nonsense. Teppic74 (talk) 12:15, 28 September 2011 (UTC)

I agree with that. I think the image should be removed, as it's misleading: it causes readers to think the combing problem is what interlaced video is supposed to look like, which is not the case. NJM2010 (talk) 12:25, 8 October 2011 (UTC)

Bandwidth compression degradation

This article needs much more content!

It only discusses "perfect" content streams -- which is not the real world. What is the full bandwidth of a perfect 1080i stream? What is the typical real-world bandwidth of OTA broadcasts, cable broadcasts, and Blu-ray sources? How much of what kinds of compression is used, and how does this degrade various kinds of content? Signal encoding redundancy/error correction/artifacts? -96.237.4.73 (talk) 19:08, 14 February 2013 (UTC)

Removing line about Charles Poynton

Currently, there is a line at the bottom of the lead which says:

The choice of 1080 lines originates with Charles Poynton, who in the early 1990s pushed for "square pixels" to be used in HD video formats.[1]

I have some issues with this. First, the source is Charles Poynton's personal website. In his words, he is "the inventor of the number 1080 found in HDTV standards". There's no independent source, so we are relying on his statements alone. If the claim were more minor, and in his biography article, like "In the early 1990s, Poynton encouraged the use of 1080 lines and square pixels in HD video", I wouldn't have a problem with it. But this line goes beyond a minor claim about himself (WP:ABOUTSELF). It's a claim that he is the originator of 1080 lines (and therefore nobody before him proposed such a thing). That's a little too much to be backed up by his own website alone.

I also have some doubts about the veracity of the claim. According to "European Perspectives on HDTV Studio Production Standards", IEEE Transactions on Broadcasting, Vol. 35, No. 3, September 1989 (doi:10.1109/11.35315), pages 281 and 282 (ellipses and bolding mine):

The technical problem is to decide how to line up all parameters other than field rate so as to allow maximum convenience, lowest equipment costs, and maximum quality and converted picture quality. ... An example ... is put forth below ... The system has a common bit-rate and a common image structure (1080 X 1920 elements). The digital pixels are square (1920 X 9/16 = 1080). The use of progressive scanning also gives balanced horizontal and vertical resolution (or square analogue pixels).

So, these authors (N. Wassiczek, G.T. Waters, D. Wood), had proposed square pixels and 1080 vertical lines as early as September 1989, earlier than "the early 1990s" from the line.

Here's another example. From Future Development of HDTV, CCIR Report BT.1217, page 180:

[CCIR, 1986-90n] suggests that an image frame with 2 250 000 samples will simplify the task of ensuring compatibility with Recommendation 601 and proposes a common image format based on 1080 active scanning lines per frame. This number of scanning lines is derived from a pixel aspect ratio of 1:1.

If you scroll up, you will see two diagrams also referencing 1920 samples and 1080 lines. I wasn't able to find CCIR 1986-90n itself, but judging from the name, I figure it came out between 1986 and 1990. That still seems too early for him to originate 1080 lines in "the early 1990s", especially since the CCIR 1990 conference ended on 1 June 1990. My speculation is that Mr. Poynton was one of multiple people to come up with the same numbers (by modifying NHK/Sony's de-facto standard, 1920 x 1035, to have square pixels). He certainly may have generated support for 1080 lines as a standard. But as it stands, there certainly aren't enough reliable sources to call him the originator/inventor of that number.
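The square-pixel derivation quoted in the sources above (1920 x 9/16 = 1080) can be checked in a couple of lines; this is just the arithmetic from the quoted passages, nothing more:

```python
width = 1920                      # samples per active line
active_lines = width * 9 // 16    # square pixels on a 16:9 raster
print(active_lines)               # 1080

# For comparison, the NHK/Sony de-facto standard mentioned above,
# 1920 x 1035, does not have square pixels: 1035 != 1920 * 9/16.
print(1035 == width * 9 // 16)    # False
```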

References

  1. ^ Poynton, Charles. "Charles Poynton – square pixels and 1080". Retrieved 2013-02-21.

HenryMP02 (talk) 03:58, 3 October 2022 (UTC)

HenryMP02 – When questioning the authenticity of claims about science and technology, it's customary to assume good faith in the absence of evidence to the contrary. I’d be happy to talk with you about the history of square sampling in HDTV, and clarify my contributions: +1 416 535 7187, any reasonable Toronto time. You may wish to review the 9-minute video at <https://poynton.ca/notes/misc/Poynton-square-pixels.html> for background. If I don’t hear from you in a week or two, I’ll post a summary here for the record. In brief, yes, I did it, and SMPTE recognized me for that work by awarding me the David Sarnoff Gold Medal in 1993. Cheers, – Charles Cpoynton (talk) 20:39, 31 August 2023 (UTC)

Sequence of the fields

The article incorrectly documented that the odd lines are stored in the first field and then the even lines are stored in the second. This is not correct, but to understand why requires a bit of history.

When interlaced analogue television was developed with the introduction of the 405-line system in 1936, the signal was transmitted as a sequence of odd and even fields. Which came first was technically unimportant; they just occurred odd-even-odd-even and so on. There was no difference other than timing changes. Just like a broken line in the middle of the road, the line follows the spaces, which follow the line, which follows the spaces, and so on.

The specification for the 405-line system specified that the odd field occurred first in the complete video frame, but there was no technical reason why it should do so. It was purely a matter of convention. In fact (with one notable exception) all the specifications for interlaced analogue video systems were written assuming the odd field came first.[1] They could have specified the even field with no change to anything. The one exception was the US National Television System Committee (NTSC). For some reason that is unlikely ever to become clear, although the makeup of the video signal was more or less identical apart from timings, they decided to specify the even field as being first. As already said, this made absolutely no difference - at least in the analogue world.

However, once digital video formats took off, it did make a difference. Digital video formats encode a complete frame as a distinct video unit (to facilitate compression, among other reasons). Because they were originally created for NTSC-standard signals, the digital video frame follows the NTSC standard and has the even field first. This survived into the DVD video format, and ultimately into the 1080i digital formats. This even-field-first sequence was preserved when the digital formats were carried over to the 625-line analogue systems.

This created an awkward problem, though it should not have done. Since 625-line video is specified as odd field first, someone decided that it was necessary to switch the field order to match the digital format's requirement, meaning that it had to be switched back on replay (because the digital encoding switches the field order so that the even field, which occurred after the odd field timewise, now comes first, giving a time 'hiccup'). The article contains an image of what happens when digital de-interlacing does not work well. The image could equally illustrate what happens when the field order is switched incorrectly (actually a frequent problem with on-line video that was created on analogue systems and has been converted to non-interlaced digital video without switching the field order - the fields play out of sequence, giving a similar comb effect).

The unfortunate part was that this was all unnecessary because the digital encoding could have simply used the even field from the previous frame which would have completely obviated the issue.
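The field-order problem described above can be modelled in a few lines. This is a toy sketch of the argument, not a description of any real codec; the function names are invented:

```python
# Toy model of the field-order problem. Fields are labelled in capture
# order; an odd-first (625-line style) source captures o0 e0 o1 e1 ...
def capture(n_frames):
    order = []
    for f in range(n_frames):
        order += [f"o{f}", f"e{f}"]   # odd field of frame f, then its even field
    return order

def wrap_even_first(capture_order):
    """Repackage into digital frames that store the even field first,
    pairing the two fields of the SAME source frame (the problematic
    choice discussed above)."""
    out = []
    for i in range(0, len(capture_order), 2):
        odd, even = capture_order[i], capture_order[i + 1]
        out += [even, odd]            # even field now plays before the odd one
    return out

src = capture(2)
print(src)                   # ['o0', 'e0', 'o1', 'e1'] - correct temporal order
print(wrap_even_first(src))  # ['e0', 'o0', 'e1', 'o1'] - each pair time-reversed
```

Pairing each odd field with the even field of the previous frame instead, as the paragraph above suggests, would keep playback in capture order with no switching on replay.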

References

  1. ^ "Characteristics of monochrome television systems" (report ref 308-20, published by the Comité Consultatif International pour la Radio (CCIR, a forerunner of the International Telecommunication Union - ITU))

86.162.147.159 (talk) 17:03, 5 October 2022 (UTC)