Tuesday, March 18, 2008

Engineering the 21st Century

An international group of leading technological thinkers, including IBN Executive Director Prof. Jackie Y. Ying, has identified 14 Grand Challenges for Engineering in the 21st Century that, if met, would significantly improve the quality of life.

The 18-member blue-ribbon committee was appointed by the National Academy of Engineering (NAE) at the request of the US National Science Foundation in 2006 and sought input from people around the world for its report. Its findings were revealed by the NAE at the 2008 Annual Meeting of the American Association for the Advancement of Science in Boston on February 15, 2008.

According to the Committee Chair, former US Secretary of Defense Prof. William J. Perry, the committee selected engineering challenges that need to be met and could be realized in the century ahead. The committee focused on identifying 'what needs to be done to help people and the planet thrive'. These challenges are grouped under four themes addressing sustainability, health, vulnerability and the joy of living.

The grand challenges identified by the committee are:

  • Make solar energy economical
  • Provide energy from fusion
  • Develop carbon sequestration methods
  • Manage the nitrogen cycle
  • Provide access to clean water
  • Restore and improve urban infrastructure
  • Advance health informatics
  • Engineer better medicines
  • Reverse-engineer the brain
  • Prevent nuclear terror
  • Secure cyberspace
  • Enhance virtual reality
  • Advance personalized learning
  • Engineer the tools of scientific discovery

"These grand challenges represent multifaceted problems that would critically impact our future, and would require attention from different walks of our society and joint efforts between nations. For example, global warming poses a direct threat on the sustainability of our civilization. Therefore, the control of green house gas release, along with the development of carbon sequestration methods and alternative energy sources/utilization require our immediate attention. These challenges can only be tackled through greater cooperation between various groups, including engineers, scientists, policy makers and industrial leaders," said Prof. Jackie Ying.

"In the area of health, personalized medicine presents exciting opportunities for tailored treatments of an individual's illness, but it can only be accomplished if scientists, engineers and medical doctors all work together. Here, engineers can contribute by making processes such as drug screening and synthesis much more efficient so that personalized medicine can be realized at an affordable price. It is also important that bioengineers work on regenerative medicine with novel cell and tissue engineering approaches to develop alternative, more biomimicking means of recovery from diseases."

"Many of the challenges also have broad impacts in different fields. For example, technological advances in virtual reality would dramatically transform communications and media, and also allow us to drastically cut down energy utilization associated with physical transportation. Breakthroughs in reverse engineering the brain would enable us to harness the full potential of the human brain, revolutionizing the way we learn, think, work and experience life; it would also lead to breakthroughs in the way we develop artificial intelligence to relieve us from hazardous, tedious and technically challenging tasks. Along with personalized learning, reverse engineering the brain may pave the way for creating new methods of educating our younger generation in a much more fun and effective manner. These new challenges represent areas of research that may profoundly impact our joy of living, helping us become more productive and achieve a healthier work-life balance."

"Other key challenges facing engineers involve developing novel technologies that not only would improve the quality of life in the developed countries, but also can be made easily available to the entire world. For example, miniaturized and portable devices that can purify water, generate energy, facilitate education, and enable communications may allow us to move away from the need of mega infrastructural developments towards compact, distributed systems that can be more easily acquired and more effectively managed/upgraded. Nanotechnology holds the key to such disruptive technology. It can make a tremendous impact especially to the people living in the rural parts of the world and in the developing countries, and hopefully reduce the disparity between the "haves" and have-nots," commented Prof. Ying.

The NAE Grand Challenges for Engineering Committee comprises:

  • William Perry, Committee Chair; former Secretary of Defense, U.S. Department of Defense; Michael and Barbara Berberian Professor and Professor of Engineering, Stanford University
  • Alec Broers, Chairman, Science and Technology Select Committee, United Kingdom House of Lords
  • Farouk El-Baz, Research Professor and Director, Center for Remote Sensing, Boston University
  • Wesley Harris, Department Head and Charles Stark Draper Professor of Aeronautics and Astronautics, Massachusetts Institute of Technology
  • Bernadine Healy, Health Editor and Columnist, U.S. News & World Report
  • W. Daniel Hillis, Chairman and Co-Founder, Applied Minds, Inc.
  • Calestous Juma, Professor of the Practice of International Development, Harvard University
  • Dean Kamen, Founder and President, DEKA Research and Development Corp.
  • Raymond Kurzweil, Chairman and Chief Executive Officer, Kurzweil Technologies, Inc.
  • Robert Langer, Institute Professor, Massachusetts Institute of Technology
  • Jaime Lerner, Architect and Urban Planner, Instituto Jaime Lerner
  • Bindu Lohani, Director General and Chief Compliance Officer, Asian Development Bank
  • Jane Lubchenco, Wayne and Gladys Valley Professor of Marine Biology and Distinguished Professor of Zoology, Oregon State University
  • Mario Molina, Professor of Chemistry and Biochemistry, University of California
  • Larry Page, Co-Founder and President of Products, Google, Inc.
  • Robert Socolow, Professor of Mechanical and Aerospace Engineering, Princeton University Environmental Institute
  • J. Craig Venter, President, The J. Craig Venter Institute
  • Jackie Y. Ying, Executive Director, Institute of Bioengineering and Nanotechnology
View the full report at www.engineeringchallenges.org.

http://www.ibn.a-star.edu.sg/news_events_article.php?articleid=139

The Bioengineering of the Future

What is Bioengineering?

Bioengineering is what you do with biotechnology. Biotechnology is what you get from studying and learning how to manipulate biology. It has been understood for more than a hundred years that life is basically a machine which takes in fuel and performs work. This machine is remarkable in that it arose through variation and selection from a very primitive self-replicating molecule. The closest analogue of that molecule in contemporary biology is the ribosome, which has been found to consist, in part, of a strand of RNA. Inside the nucleus of a human cell there are twenty-two pairs of chromosomes, plus one more pair that is either matched (XX) or made of two different chromosomes (XY). These, as you are probably tired of hearing, are mostly DNA. Our cells also contain organelles called mitochondria. The mitochondrion was once a separate life-form: it evolved apart from the type of cells that make up most of our body, and its internal structure is significantly different from that of the cell it now inhabits. Mitochondria have their own DNA but have come to live within our cells as symbionts. These scientific findings will underlie the rest of this essay.

What is the significance of DNA? It's just a molecule, right? Yes, it is. That is exactly what it is. It has no magical properties whatsoever. The science-fiction notion of mutating an adult human by changing their DNA to that of another species would not produce any morphological change in the person, because there is not, yet at least, a gene that can reshape bone. Most random gene changes result in some kind of disease, such as cancer. This should seem weird to you. I am talking about changing a DNA molecule, but I'm not suggesting that it becomes anything other than a DNA molecule. Changing a garden-variety molecule by altering even a single bond usually makes something totally different. What is it about DNA that makes it different? For one thing, DNA is actually a family of molecules built from four bases: adenine, thymine, cytosine, and guanine. RNA uses a molecule called uracil in place of thymine. A strand of DNA is a linked chain of any number of those bases in any combination (in the double helix they pair up across the two strands, which is why they are often called "base pairs"). When we talk about modifying DNA, we are talking about rearranging sequences of those bases.

The DNA molecule serves a function no different from the hard drive on your computer. It stores information. The three billion year history of the DNA molecule is a testament to its effectiveness as a storage medium. RNA is more like your computer's working memory. It is natural to consider the act of designing a DNA strand in the same way we would consider the act of writing a program for a computer. A bioengineer, or as it was put in Blade Runner, a genetic designer, is a DNA programmer. I am trying to express these ideas in these terms because a certain mysticism has arisen around DNA that needs to be dispelled before we can consider the questions of what we might be able, or want, to do with our newfound ability to manipulate it.
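To make the storage analogy concrete, here is a minimal sketch in Python (purely illustrative, not tied to any real bioinformatics library) that treats a DNA strand as a string over the four-base alphabet, derives its complementary strand, and naively "transcribes" it to RNA by swapping thymine for uracil:

# Illustrative only: a DNA strand modeled as a string over the bases A, T, C, G.
DNA_COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand: str) -> str:
    """Return the base-paired complementary strand (read in the same direction)."""
    return "".join(DNA_COMPLEMENT[base] for base in strand)

def transcribe(strand: str) -> str:
    """Naive DNA-to-RNA transcription: uracil (U) takes the place of thymine (T)."""
    return strand.replace("T", "U")

gene_fragment = "ATGCCTGAA"        # a made-up 9-base sequence
print(complement(gene_fragment))   # TACGGACTT
print(transcribe(gene_fragment))   # AUGCCUGAA

In this picture, "rearranging sequences of base pairs" is string editing; the hard part, as the essay suggests, is knowing what the edits mean.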

http://home.mchsi.com/~deering9/botf.html

Prognosis Positive for Biomedical Engineering Grads

The employment outlook for biomedical engineering graduates is, in a word, good. So say three professors who are tops in the field, from Northwestern University in Evanston, Ill., to Clemson University in South Carolina.

"The outlook is good and getting better as employers recognize the value of the specialty of biomedical engineering," notes Dr. Scott Delp, associate professor of biomedical engineering and rehabilitation at Northwestern University, as well as a research scientist at the Rehabilitation Institute of Chicago. "The more biomedical engineers who go out into industry, the more I see that trend continuing."

Currently, Delp estimates that about half of Northwestern's biomedical engineering undergrads go on to medical school, while 25 percent head to grad school and the remaining 25 percent, roughly 20 students, go on to jobs in industry right out of college.

"Biomedical engineers have unique skills," Delp says. "Often they are needed to bridge traditional engineering skills with medical applications. For someone to have a formal education in both disciplines is very helpful."

Delp asserts that the U.S. dominates the world in the healthcare marketplace, which translates into an optimistic view of the future for his field.

"We have a strong export/import balance," he says. "The growth of the healthcare industry and the domination of the U.S. healthcare industry worldwide are strong indicators that biomedical engineers will be doing well in the coming years."

As vast as the field is, all areas of biomedical engineering represent good employment prospects for today's graduates, according to Dr. Larry Dooley, professor of bioengineering at Clemson University. Dooley notes that both the medical device marketplace and the diagnostics marketplace are expanding in the U.S. in terms of new production capability. That translates into a wealth of opportunities for grads possessing bioengineering skills.

http://www.graduatingengineer.com/futuredisc/biomed.html

Paper predicts bioengineering future

Over the next 25 years, the development of more sophisticated biomedical devices will revolutionize the diagnosis and treatment of conditions ranging from osteoarthritis to Alzheimer's disease, according to MIT professors in an article in the February 7 issue of the Journal of the American Medical Association (JAMA).

The MIT article was also one of three from the issue to be featured at a February 6 media briefing on "Opportunities for Medical Research in the 21st Century." It was selected from among 24 in the special theme issue.

In the article, Associate Professor Linda G. Griffith and Professor Alan J. Grodzinsky explored the recent history of biomedical engineering and made projections for the future of the field. Dr. Griffith presented the article at the New York media briefing, which was a collaboration between the Albert and Mary Lasker Foundation and JAMA.

"The most visible contributions of biomedical engineering to [current] clinical practice involve instrumentation for diagnosis, therapy and rehabilitation," wrote Professors Griffith and Grodzinsky. Dr. Griffith has positions in the Department of Chemical Engineering and the Division of Bioengineering and Environmental Health (BEH). Dr. Grodzinsky is director of the Center for Biomedical Engineering and a professor in the Department of Electrical Engineering and Computer Science and the Department of Mechanical Engineering.

Biomedical engineering is broadly defined as the application of engineering principles to problems in clinical medicine and surgery. A revolution in disease diagnosis began in the 1970s with the introduction of computerized tomography, magnetic resonance imaging and ultrasonic imaging.

The field also has been responsible for the development of new therapeutic devices such as the cochlear implant, which has helped many hearing-impaired people in the United States experience dramatic improvement. Cardiovascular therapy also has been changed by the introduction of life-saving implantable defibrillators in the 1980s. In addition, vascular stent technology for the treatment of aneurysms, peripheral vascular disease and coronary artery disease has made it possible for minimally invasive procedures to replace major surgery.

"Cell and tissue engineering also has emerged as a clinical reality," the authors wrote. "Products for skin replacement are in clinical use and progress has been made in developing technologies for repair of cartilage, bone, liver, kidney, skeletal muscle, blood vessels, the nervous system and urological disorders."

At the same time, biomedical engineering is undergoing a major ideological change. "The fusion of engineering with molecular cell biology is pushing the evolution of a new engineering discipline termed 'bioengineering' to tackle the challenges of molecular and genomic medicine," the authors wrote. "In much the same way that the iron lung (an engineered device) was rendered obsolete by the polio vaccine (molecular medicine), many of the device-based and instrumentation-based therapies in clinical use today will likely be replaced by molecular- and cellular-based therapies during the next 25 years."

Professors Griffith and Grodzinsky expect to see continued growth and development in the field of biomedical engineering, resulting in new diagnostic and treatment options for patients. "In the next 25 years, advances in electronics, optics, materials and miniaturization will push development of more sophisticated devices for diagnosis and therapy, such as imaging and virtual surgery," they wrote.

They suggest that the new field of bioengineering will give rise to a new era of "lab on a chip" diagnostics, enabling routine and sensitive analysis of thousands of molecules simultaneously from a single sample.

"A potentially even greater impact of bioengineering will result from the increased ability to incorporate molecular-level information into complex models. The result will be a revolution in diagnosis and treatment of diseases ranging from osteoarthritis to Alzheimer disease," they wrote.

"Either by looking for single-signature molecules (e.g., cancer antigens) or by using appropriate algorithms to derive relationships between many interacting molecules, early prediction of onset of disease may be possible," they continued. "For example, osteoarthritis might be detected just when cartilage degradation begins and before damage is irreversible; Alzheimer's disease might be detected in early adulthood when it is believed lesions might first form and before cognitive decline."

"In each case, new drugs developed with the aid of molecular and cellular engineering will likely be available to combat disease progression," they concluded. "For osteoarthritis, these advances would obviate the need for joint replacement surgery... For Alzheimer's disease, which lacks current therapeutic options, the impact of bioengineering will be extraordinary."

http://web.mit.edu/newsoffice/2001/biomedical-0214.html

Human Gene Therapy: Present and Future

James M. Wilson
Institute for Human Gene Therapy, University of Pennsylvania

In his presentation at the 1998 Cambridge meeting, James Wilson characterized gene therapy as a novel approach in its very early stages. Its purpose, he said, is to change the expression of some genes in an attempt to treat, cure, or ultimately prevent disease. Current gene therapy is primarily experiment based, with a few early human clinical trials under way.

Theoretically, he continued, gene therapy can be targeted to somatic (body) or germ (egg and sperm) cells. In somatic gene therapy the recipient's genome is changed, but the change is not passed along to the next generation. This contrasts with germline gene therapy, in which the goal is to pass the change on to offspring. Germline gene therapy is not being actively investigated, at least in larger animals and humans, although there is much discussion about its value and desirability.

Gene therapy should not be confused with cloning, which has been in the news so much in the past year, Wilson continued. Cloning, which is creating another individual with essentially the same genetic makeup, is very different from gene therapy.

Listing three scientific hurdles in gene therapy, Wilson emphasized the concept of vehicles called vectors (gene carriers) to deliver therapeutic genes to the patients' cells. Once the gene is in the cell, it needs to operate correctly. Patients' bodies may reject treatments, and, finally, there is the need to regulate gene expression. Wilson expressed optimism that many groups are making headway and cooperating to overcome all these obstacles.

Viruses have evolved a way of encapsulating and delivering their genes to human cells in a pathogenic manner. Scientists have tried to take advantage of the virus's biology and manipulate its genome to remove the disease-causing genes and insert therapeutic genes. These gene-delivery vehicles will make this field a reality, he said.

In the mid-1980s, the focus of gene therapy was entirely on treating diseases caused by such single-gene defects as hemophilia, Duchenne's muscular dystrophy, and sickle cell anemia. In the late 1980s and early 1990s, the concept of gene therapy expanded into a number of acquired diseases. When human testing of first-generation vectors began in 1990, scientists learned that the vectors didn't transfer genes efficiently and that they were not sufficiently weakened. Expression and use of the therapeutic genes did not last very long.

In 1995, Wilson continued, a public debate led to the consensus that gene therapy has value although many unanswered questions require continued basic research. As the field has matured over the last decade, it has caught the attention of the biopharmaceutical industry, which has begun to sort out its own role in gene therapy. This is critical because ultimately this industry will bring gene therapies to large patient populations.

Wilson reviewed several specific gene-therapy cases involving high cholesterol, hemophilia, and cystic fibrosis. He emphasized that the response to any therapy in a heterogeneous patient population will be quite variable.

He asked the audience to think about gene therapy, not necessarily to treat genetic disease but as an alternative way to deliver proteins. Protein therapeutics currently are manufactured by placing genes in laboratory-cultured organisms that produce the proteins coded by those genes. Examples of such manufactured proteins include insulin, growth hormone, and erythropoietin, all of which must be injected frequently into the patient.

Recent gene therapy approaches promise to avoid these repeated injections, which can be painful, impractical, and extremely expensive. One method uses a new vector called adeno-associated virus, a virus that causes no known disease and doesn't trigger a patient immune response. The vector takes up residence in the cells, which then express the corrected gene to manufacture the protein. In hemophilia treatments, for example, a gene-carrying vector could be injected into a muscle, prompting the muscle cells to produce Factor IX and thus prevent bleeding. This method would end the need for injections of Factor IX -- a derivative of pooled blood products and a potential source of HIV and hepatitis infection. In studies by Wilson and Kathy High (University of Pennsylvania), patients have not needed Factor IX injections for more than a year.

In gene therapies such as those described above, the introduced gene is always "on", so the protein is always being expressed, possibly even when it isn't needed. Wilson described a newer permutation in which the vector contains both the protein-producing gene and a type of molecular rheostat that reacts to a pill to regulate gene expression. This may prove to be one of gene therapy's most useful applications as scientists begin to consider it in many other contexts, he said. Wilson's group is conducting experiments with ARIAD Pharmaceuticals to study the modulation of gene expression.

Wilson stated that only so much can be done in academia and that the biopharmaceutical industry has to embrace gene therapy and handle issues of patents, regulatory affairs, and the optimum business model. An example of a dilemma that society may be facing can be seen in the treatment of hemophilia. Infusing a patient with the replacement protein, which stops bleeding episodes but doesn't prevent them, currently costs about $80,000 a year. Why would a vector to prevent bleeding for 5 to 10 years be commercialized when it would displace such a lucrative treatment, and how would this gene therapy be delivered to the public?

Wilson concluded his presentation by outlining future milestones in the field: proof of concept in the next few years in model inherited diseases, followed by cancer and cardiovascular diseases; continued explosive activity in technological development; development of regulatory policy (with the Food and Drug Administration); and commercial development.

http://www.ornl.gov/sci/techresources/Human_Genome/publicat/hgn/v10n1/15wilson.shtml

Our future genes - genetic research

Q: What's so important about genetics?

Spelsberg: Genes control all the physical traits you inherit. They instruct your cells to make proteins that determine everything from the color of your eyes to whether you'll be at risk for--or get--certain diseases.

Today scientists are unlocking the blueprint of genes at their most basic level. The Human Genome Project involves researchers around the world. They hope to identify and interpret (decode) the roughly 100,000 genes that tell the story of human heredity. By locating and adjusting genetic material, they may possibly prevent or alter the course of certain diseases.

Q: What are some potential uses for genetic engineering?

Spelsberg: Scientists hope to insert material to erase or replace "bad" genes, such as those that lead to colon cancer. Or doctors might enhance the capabilities, maybe even the number, of "good" genes. Good genes might strengthen your immune system or keep your blood vessels from clogging.

If you have diabetes, for example, new genetic material might encourage production of insulin. If you have a kidney transplant, genetic material could halt the organ rejection process.

Genetic engineering may make you more receptive to chemotherapy or better able to resist hepatitis or the AIDS virus.

Genetic engineering may enable scientists to correct genetic defects, thereby preventing ailments from being passed on to one's descendants.

http://findarticles.com/p/articles/mi_m0826/is_n4_v10/ai_15504380

BFI in practice: games

Two things might worry us about the introduction of this innovation:

- a significant loss of brightness
- variations in the homogeneity of brightness.

Unfortunately, our fears were confirmed.

One good point, however, is that the afterglow noticeably diminished once the BFI was activated, and the improvement was obvious. We estimate that it was cut in half. However, we are also sure that some of you won't activate it in games. Why? Because the 60 Hz scanning flicker might be too disturbing for some. In fact, you end up having to choose the lesser of two evils: is it better to have a slightly blurred image or a rendering that flickers? After a few days of gaming, we found that this depends on the game. Some have rather homogeneous images (strategy games, for example), and the scanning across dark areas can be distracting. In this case, we preferred playing without BFI, and fortunately this is where afterglow is least disturbing: we don't need perfect reaction time for this type of game and can make do with a slight blur when units are moving.


Then there are FPS games, in which the character is always moving. Here, we weren't disturbed by the backlight tubes switching on and off. However, if you are a "super demanding" user, you won't be able to play with the BFI activated and you will regret buying this monitor. Don't concentrate on this point too much. Play your games and then adapt the monitor to your use, perhaps adjusting the BFI intensity (three levels are available in the OSD) according to your perception after an hour of gaming.

Reaction time test
A car moves from left to right at high speed.

The movement isn't perfectly fluid. Depending on its speed, the car is shown in several successive positions. If these positions are close enough together, the eye perceives the movement as fluid.
[Illustrations: a perfect monitor compared with a monitor showing three ghost images]

A monitor without ghosting effects would have previous images completely fading away when a new one appears. This is the theory and in practice, it's often not the case as images fade progressively. Sometimes up to 5 afterglow images remain on the monitor and represent the visible white trail behind objects. Some monitors have strong overdrives in addition to image anticipation algorithms. In this case, an image can appear in front of the main object, creating a white halo ahead of objects in motion.
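To give a rough feel for why several ghost images can linger, here is a small hedged sketch (the decay model and the numbers are our own assumptions, not measurements from this test) that counts how many previous frame positions would still retain more than 10% residual brightness:

import math

def ghost_frames(response_time_ms: float, frame_time_ms: float = 1000 / 60,
                 threshold: float = 0.10) -> int:
    """Count earlier frames still showing more than `threshold` residual brightness,
    assuming the old image decays exponentially and the quoted response time is
    the time needed to fall to ~10% (a simplifying assumption)."""
    tau = response_time_ms / math.log(10)   # time constant of the assumed decay
    count, t = 0, frame_time_ms
    while math.exp(-t / tau) > threshold:
        count += 1
        t += frame_time_ms
    return count

print(ghost_frames(40))   # a sluggish 40 ms transition -> ~2 ghost frames here
print(ghost_frames(6))    # a fast 6 ms transition -> 0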

We capture afterglow with a camera, using a shutter speed of 1/60 s for CRTs compared to 1/1000 s for LCDs, and take 50 pictures per test. We can then see a monitor's ghosting, that is, all of the car's positions over the whole process. The most important image is the one on the left, the better one: it is the one displayed longest on the monitor, while the one on the right is in transition.

Here are the two extreme states with each monitor as afterglow oscillates.

BenQ FP241WZ, PVA 6 ms with BFI activated at 3

BenQ FP241WZ and FP241W, PVA 6 ms without BFI.

Dell 2407WFP, S-PVA 6 ms


The afterglow captured in these pictures is unchanged across all three monitors, which shows that BFI has no impact on the response time of the pixels themselves. BFI acts instead on our eyes, more specifically on retinal persistence.



http://www.behardware.com/art/imprimer/646/



HDTV Displays

To realize the maximum potential of HDTV, the display must be fully HD compatible. For most users, this represents the biggest challenge and largest expense in their HD migration budget. Picking the right HD display relies heavily on personal taste, while prices vary greatly from under $1000 to many thousands of dollars. Here are a few key points to look for in order to ensure not only that your display is HDTV compatible, but more importantly, HDTV optimized:

Wide screen: Your display should be capable of displaying a widescreen (16:9) HDTV image. It should also be able to display a standard (4:3) SDTV image as well.

Resolution: Your display should have enough resolution to faithfully display an HDTV image. For many displays, this means a native resolution (the display's intrinsic resolution) of at least 1280 x 720 pixels. Higher native resolution is better, with so-called "full HD" resolution being 1920 x 1080 pixels.

Video Inputs: Your display should have a full complement of both analog video inputs (such as composite video, s-video, and component video) for legacy components as well as digital video inputs (such as DVI or HDMI) for new HDTV and EDTV sources.

Note: HDCP support is a must for all HDTV displays. If you buy a display with a DVI input, you MUST ensure that it supports HDCP; if it doesn't, you may not be able to view HDCP-encrypted source material from cable boxes, etc., on that display! If your display has an HDMI input, you're in good shape, as the HDMI standard fully supports HDCP.


http://www.audioholics.com/education/display-formats-technology/hdtv-past-present-and-future-part-i-history


Black Frame Insertion

BFI / MPA: what is really going on
This system might be surprising at first if you have never heard of it. The configuration obviously has little or nothing in common with the BFI described by BenQ in its data sheets. It is, however, very close to Samsung's MPA technology.


The Korean manufacturer described it at the last CeBIT and presented a 24" monitor based on a PVA panel. The principle relies on introducing an artificial scanning by successively shutting down the fluorescent tubes located behind the panel. The similarities are striking, aren't they? So what should we make of this VA monitor, so similar to the one developed by Samsung? There are several possibilities: the people behind the technical documents could have misinterpreted the information, AU Optronics could have surprisingly changed its approach, or this could be an adaptation of Samsung's technology in place of the one developed by AU, which might have been delayed.

Either way, we aren't talking about a complete MPA, which includes a second element that is absent from the current BFI (can we still use this name?). Samsung monitors are clocked at 120 Hz: each image received at 60 Hz is displayed twice, for a refresh frequency of 120 Hz instead of 60 Hz. Another subtle difference is that BenQ uses the time when a segment is black to begin drawing part of the next image, the objective being that pixel transitions happen in the dark so the user sees less of the afterglow. With an area lit and displaying image 1, one of the 16 segments is switched off. During this dark period of (1/60 s) / 16 segments, roughly 1 millisecond, the monitor begins to draw image 2 and erase the previous one. The backlighting is then switched back on; if the liquid crystals' response time were short enough, image 2 would already be drawn and the previous one erased. In practice, this 1 ms window is a little short for the liquid crystals currently used in the panel. It can slightly diminish the afterglow sensation, but it can't erase it completely yet.
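To make the timing explicit, here is a small back-of-the-envelope calculation (a sketch using only the figures quoted above: a 60 Hz source and 16 backlight segments):

frame_rate_hz = 60      # incoming video frames per second
segments = 16           # CCFL tubes / horizontal areas behind the panel

frame_time_ms = 1000 / frame_rate_hz        # ~16.7 ms per frame
dark_window_ms = frame_time_ms / segments   # ~1.04 ms of darkness per segment

print(f"frame time:  {frame_time_ms:.2f} ms")
print(f"dark window: {dark_window_ms:.2f} ms per segment")
# With liquid crystals needing several milliseconds to settle, ~1 ms in the dark
# hides only the start of each transition, which is why afterglow is reduced
# rather than eliminated.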

Here now is the functioning as we have understood it (we have to say here that no one at BenQ confirmed these points, their official version still involves black frames):

The monitor is divided into 16 horizontal areas of 75 lines, with a CCFL tube behind each area. T is the time measured.

At T=0, the processor shuts down the first area, the highest one, and leaves the 15 others with the image displayed.

At T=1, area 1 is still dark and the processor searches in the monitor’s memory for the piece of the next image, sending the display command to the liquid crystals.

At T=2, the crystals have drawn (or are finishing) the slice of image 2; the processor switches the backlight on again in area 1 and turns off area 2.


The loop thus consists of: darken an area, draw that area's slice of the next image in the dark, switch the area back on, darken the area below, and so on. Once area 16 has been drawn, the cycle starts again from area 1 with the next image.
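Here is a minimal simulation of that loop as we have understood it (a sketch only: the 16 segments and 60 Hz timing come from the description above; everything else, including the printout, is our own illustration):

# Simulate one frame period of the scanning backlight described above.
SEGMENTS = 16
FRAME_TIME_MS = 1000 / 60
STEP_MS = FRAME_TIME_MS / SEGMENTS   # ~1.04 ms per area

def scan_frame(frame_index: int) -> None:
    for area in range(1, SEGMENTS + 1):
        t = frame_index * FRAME_TIME_MS + (area - 1) * STEP_MS
        # 1. The backlight behind this horizontal area is switched off.
        # 2. While it is dark, the liquid crystals in this area start drawing
        #    the corresponding slice of the *next* image.
        # 3. The backlight is switched back on; the area below goes dark next.
        print(f"t={t:6.2f} ms  area {area:2d} dark, drawing slice {area} of frame {frame_index + 1}")

scan_frame(0)   # after area 16, the cycle restarts from area 1 with the next frame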




This is a progressive type of display: at any given time, 15 of the 16 areas are lit while one is dark. As we have seen above, each area stays dark for approximately 1 ms.

For now, this fraction of a second is often too short for the next image to be drawn entirely in the dark. However, it already minimises the afterglow caused by transitions from one color to another, particularly across intermediate color scales.

Let's take the case of a monitor with a maximum response time of 16 ms, with an 8 ms rise and an 8 ms fall time. If we wish to change one area from red to blue, and if the rise and fall were linear, the displayed dot would show the following colors:


In fact, the curve followed by each color isn't linear. It looks rather like this:



If we follow the time transition, colors change like this:


Before the dotted line are the colors displayed during the first millisecond. The black square marks the portion of the transition that happens in the dark on a monitor with a rise and fall time of 8 ms. In our example, the visible transition no longer starts from a pure 100% red but from roughly 25% red and 25% blue. Afterglow should be reduced.
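As a rough numerical illustration of this idea, the sketch below models the red-to-blue transition with a symmetric exponential response (our own assumption, chosen only to show the mechanism; the measured curves in the article are shaped differently, which is why it arrives at different percentages) and reports where the transition stands at the end of the ~1 ms dark window:

import math

RISE_FALL_MS = 8.0   # quoted rise and fall time
DARK_MS = 1.0        # approximate dark window per segment

# Assumption: an exponential response where the quoted 8 ms is the time to
# complete ~90% of the swing. For illustration only, not a measured curve.
TAU = RISE_FALL_MS / math.log(10)

def red_blue_at(t_ms: float) -> tuple:
    """Red falls from 100% toward 0%, blue rises from 0% toward 100%."""
    progress = 1 - math.exp(-t_ms / TAU)
    return (1 - progress) * 100, progress * 100

red, blue = red_blue_at(DARK_MS)
print(f"end of dark window: {red:.0f}% red, {blue:.0f}% blue")
# Everything before that point happens with the backlight off, so the visible
# part of the transition starts part-way through instead of from a pure red.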

http://www.behardware.com/art/imprimer/646/

HDTV Past, Present and Future

The History of HDTV and Changing Needs

It has been nearly 80 years since the first public demonstration of television took place in a crowded laboratory in London. Since that time, television has advanced from blurry black and white pictures to stunning high-definition images with life-like depth and realism. How were these achievements made? More importantly, what should we expect in the future as we approach television's centennial?

A Look Back

Today, there are more than 220 million televisions in the United States. These TV sets have their historical roots in technology that was pioneered in the late 1920s and 1930s. While many television milestones took place during these years, it was not until 1949 that sales of new sets really started taking off. In 1953, the NTSC (National Television System Committee) standard was adopted for the transmission of color television, and in 1954, RCA launched the first commercially available color TV. The 1950s also saw the beginning of a shift in television architecture, moving away from vacuum tube chassis to more solid state components.

During the 1960s, the transition from black and white TVs to color was largely completed. Other advancements during this decade included "HiFi" TV sets and widespread popularity of remote control units. In the mid 1970s, the advent of the VCR transformed the television experience with the ability to record and play back videotapes. The 1970s and 1980s were a time when many Americans took down their unsightly TV antennas and replaced them with cable TV boxes, offering a score of channels to flip through. The so-called "MTV generation" was born into this era when television was no longer just a family entertainment center, but the center point of personal and generational expression. While it would be unfair to say that the ensuing years have been technologically dormant for television, few would argue that the next major milestone in TV history was the advent of high definition television (HDTV).

The Early Years of HDTV

Modern-day HDTV has its roots in research that was started in Japan by the NHK (Japan Broadcasting Corporation) in 1970. In 1977, the SMPTE (Society of Motion Picture and Television Engineers) Study Group on High Definition Television was formed. The group published its initial recommendations in 1980, which included, among other things, the definition of a wide screen format and an 1100-line scanning structure. The first demonstration of HDTV in the United States took place in 1981 and generated a great deal of interest. In 1987, the FCC (Federal Communications Commission) sought advice from the private sector and formed the Advisory Committee on Advanced Television Service. Initially, there were as many as 23 different ATV (Advanced Television) systems proposed to this committee, but by 1990, there were only 9 proposals remaining, all based on analog technology. However, by mid-1991, the leading ATV designs were based on a new all-digital approach. A joint proposal from several companies detailing an all-digital ATV system was given to the FCC in 1995. Following certain changes and compromises, this proposal was approved by the FCC in December 1996 and became the mandated ATSC (Advanced Television Systems Committee) standard for terrestrial DTV/HDTV broadcasting. You can read more about the history of the ATSC at http://www.atsc.org/history.html.

HDTV Today

After 35 years of development, high definition television has finally started making inroads in the consumer marketplace. Today, you hear a lot of HD buzzwords: HD Ready, HD Compatible, Integrated HDTV, etc. To help consumers deal with the mounting tide of HDTV-related questions, the FCC created a consumer website, http://www.dtv.gov, in October 2004. The FCC has also set a timeline for the conversion from analog to digital television (DTV). As it stands now, analog broadcasting will cease in the United States on December 31, 2006, but many believe that the FCC will extend this date until the penetration of DTV has reached 85% or greater in most key markets. Those still using old-fashioned analog TVs won't be entirely out of luck; the FCC mandate requires that consumers be able to purchase a converter box so that their older analog sets can receive the new digital signals. While these converter boxes should be affordable, they will only output the same low-resolution signals that our TVs currently use. If you want to see true HD, you'll need to spring for a fancy new HDTV set, projector or flat panel display.

What is HDTV?

What is HDTV really all about? What's new, and why is it better? The best way to make a comparison between standard definition television (SDTV) and high definition television (HDTV) is to consider today's popular digital cameras. Some years ago, when digital cameras first appeared on the consumer market, a popular digital camera featured a 1.6 Mega Pixel (Mpix) image sensor. At that time, 1.6 Mpix was considered high quality. Today, one can easily find 5 and 6 Mpix cameras with far better image quality for a lot less money.

In a similar way, HDTV delivers significantly more resolution than SDTV. For example, a 1080i HDTV signal offers about six times the resolution of a conventional 480i SDTV signal. HDTV also features a wider (16:9) aspect ratio format that more closely resembles human peripheral vision than the (4:3) aspect ratio used by conventional TVs in the past. Furthermore, HDTV is based on a system of 3 primary image signal components rather than a single composite signal, thus eliminating the need for signal encoding and decoding processes that can degrade image quality. Perhaps the biggest advantage over the old analog SDTV system is that HDTV is an inherently digital system. If installed properly, digital HDTV can provide the ultimate in pristine image quality, but there are many factors that must be considered, as we shall see.
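A quick sanity check on that "about six times" figure, using nominal pixel counts (broadcast 480i is often carried at 704 or 720 pixels wide, so the ratio is approximate):

# Nominal pixel counts; actual broadcast rasters vary slightly.
hd_1080 = 1920 * 1080   # 2,073,600 pixels
sd_480  = 720 * 480     #   345,600 pixels

print(hd_1080 / sd_480)   # 6.0 -> roughly six times the resolution of 480i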

HDTV in the Home

What do you need to see HDTV? As in any visual system there are three basic components to consider:

HDTV Sources:

When examining possible sources, one must be careful to distinguish between true HDTV sources and those that offer Enhanced Definition (EDTV), the latter being normal (SDTV) video that is scaled or "up-converted" to a higher resolution. While EDTV can certainly provide dramatic picture quality improvements over the original SDTV source, it can never offer the same level of image quality as a true HDTV source.

Examples of true HDTV sources are:

  1. Off-air ATSC receivers using HDTV
  2. Digital cable Set Top Boxes (STB) that offer HDTV service
  3. Digital satellite receivers that offer HDTV service (i.e. DirecTV, DISH Network, et al.)
  4. Windows Media High Definition Video
  5. HD-DVD and Blu-Ray DVD players

Examples of EDTV sources are:

  1. DVD players featuring DVI / HDMI outputs (with built-in HDTV scalers)
  2. Video image processors (scalers)
  3. Digital cable Set Top Boxes (STB) that offer EDTV service
  4. Digital satellite receivers that offer EDTV service (i.e. DirecTV, DISH Network, et al.)

Another important consideration is analog versus digital sources. As previously stated, the ATSC has adopted a digital transmission system for HDTV; however, there are some HDTV sources on the market today that only offer analog outputs. Analog HDTV sources will become increasingly rare in a world of all-digital HDTV displays. This is especially true because all newer digital systems also employ HDCP (High-bandwidth Digital Content Protection) to safeguard digital content against piracy. HDCP cannot be implemented in analog systems.

http://www.audioholics.com/education/display-formats-technology/hdtv-past-present-and-future-part-i-history

BenQ FP241WZ First LCD BFI monitor

The BenQ FP241WZ, the first BFI monitor… isn't really BFI, nor 100 Hz or 120 Hz.
BenQ pulled off an amazing trick. At CeBIT it unveiled a technology called BFI, which stands for Black Frame Insertion. As the name implies, the principle is supposed to rely on inserting a black frame between two colored images to reduce the afterglow due to retinal persistence. The surprise is that this isn't what has been implemented in the BenQ monitor. Our tests (and first of all our eyes) show that the technology built into the FP241WZ looks very much like a variant of the MPA (Motion Picture Acceleration) unveiled by Samsung at the last CeBIT, and these two technologies are quite different.

First of all, the BFI name isn't really common knowledge yet. In announcements to the general public, BenQ prefers to call it AMA-Z. AMA, for BenQ, simply corresponds to the overdrive; the Z indicates that BFI has been implemented. This AMA-Z (in fact, the BFI alone) is available via the OSD and can be set to one of four levels (0, 1, 2 and 3), where 0 is off. The higher the level, the stronger the afterglow correction (we will come back to this point later).

And is it 100 or 120 Hz? There is some confusion in this area due to documents provided by BenQ. We have to point out that they haven't written anywhere that the monitor works at a frequency of 100 or 120 Hz. The misunderstanding comes from a significant imprecision in the manufacturer's data sheet. BenQ described BFI as a mode with 60 frames per second and a black frame inserted each time. One could jump to the hasty conclusion that the monitor "artificially" increases the frequency from 60 to 120 Hz, the figure simply representing the alternation between color and black frames. However, this isn't what is happening at all. BenQ doesn't insert a black screen; instead, BFI introduces an artificial scanning by successively shutting down the 16 fluorescent tubes located behind the panel. That doesn't make the monitor any less interesting, only that the approach and principle are completely different. The BenQ FP241WZ is the first LCD with a scanning like we used to have on CRTs. Now the question is: will it deliver the same reaction time?
A little aside: Samsung did the same thing with 100 Hz
The funny thing is that Samsung, the originator of that other major innovation, 100 Hz, made even more mistakes in its explanations. We read all sorts of things about 100 Hz, even on Samsung's own website: at times it was said to add extra images per second, at times to insert black frames by turning off the backlighting for every other image. However, this isn't the case. As we have explained, Samsung's 100 Hz is a real 100 Hz, with a processor devoted to calculating an intermediate image inserted between each two images sent by the video source. For more information, take a look at our test of the LE40M73BD TV.
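For contrast with BenQ's scanning backlight, here is a toy sketch of what a true 100 Hz mode does conceptually: it computes an intermediate image between each pair of source frames. A plain average of pixel values is used below purely as a stand-in; real processors such as Samsung's perform motion-compensated interpolation, which is far more sophisticated:

# Toy frame interpolation: a 50 Hz source becomes a ~100 Hz output by inserting
# one computed frame between each pair of source frames.
def interpolate(frame_a: list, frame_b: list) -> list:
    """Stand-in for motion-compensated interpolation: a plain per-pixel average."""
    return [(a + b) // 2 for a, b in zip(frame_a, frame_b)]

def double_rate(frames: list) -> list:
    output = []
    for a, b in zip(frames, frames[1:]):
        output.append(a)
        output.append(interpolate(a, b))   # the extra, computed frame
    output.append(frames[-1])
    return output

source = [[0, 0, 0], [100, 100, 100]]   # two tiny "frames" of grayscale values
print(double_rate(source))              # [[0, 0, 0], [50, 50, 50], [100, 100, 100]]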
The scanning in pictures
What a surprise! We thought that we would obtain black images and here are two results chosen randomly with the AMA-Z function activated:



If we take dozens of pictures, we in fact see an entire series of intermediate steps, which you can see when they are side by side:


1, 2, 3... 16 steps: there are 16 fluorescent tubes behind the panel. This also corresponds to what Samsung explained to us about its 24" monitor. In the photos, the two tubes located at the two most extreme positions appear to be shut down simultaneously, but upon further analysis we concluded that the tubes are shut down successively.


http://www.behardware.com/art/imprimer/646/