Dr. Tze-Chiang Chen receives IEEE honor for technology leadership

“Generation after generation we hit the technology wall. IBMers possess the attitude that we never see the wall as a barrier; we see it as an opportunity for innovation.”
– T.C. Chen
When Dr. Tze-Chiang Chen joined IBM’s Thomas J. Watson Research Center in Yorktown Heights, New York, in 1984, microprocessors ran at about 5 MHz and held 275,000 transistors each. Since then, Chen has helped lead one of the world’s most advanced silicon chip technology evolutions. Today, thanks in part to his contributions, microprocessors run at about 4 GHz with one to two billion transistors per processor.

On August 20, 2011, the Institute of Electrical and Electronics Engineers (IEEE) honored Chen with the 2011 Ernst Weber Engineering Leadership Recognition award at the IEEE Honors Ceremony in San Francisco, California. The award acknowledges Chen’s exceptional managerial leadership and contributions in the field of silicon chip technologies. These contributions include serving as the senior manager responsible for the announcement of the world’s fastest and smallest 256-megabit DRAM in the IBM/Siemens/Toshiba DRAM Development Alliance.

Chen credits much of his success to IBM executives who gave him “opportunities, one after the other, to conquer technology challenges.” Chen’s drive, enthusiasm, and dedication to his work undoubtedly had a great deal of impact on his success.

“I am very fortunate to be at IBM Research and to have been given technology challenges over all these years,” says Chen, noting that one of the most enjoyable parts of his IBM career has been to work with a number of talented individuals in research and technology.

Chen says he is honored by the award and by the recognition it brings to IBM, proof, in his words, that the company is consistently at the forefront of technology evolution.

Chen currently manages more than 600 IBM researchers across six global research laboratories. He has driven the research, development and application of silicon microelectronics technology for a variety of IBM products and solutions. This involvement at all levels – from concept to production – has provided him with what he calls “a fortunate opportunity to initiate, participate, and manage.”


IBM's first cognitive computing chips mimic functions of the brain

Today, IBM announced its first cognitive computing chips, designed to emulate the brain’s abilities for perception, action and cognition. The technology could consume orders of magnitude less power and space than today’s computers, and give computers a sort of "right brain" capability to match their superior calculating abilities. Following Watson, it is yet another example of IBM's quest to build learning systems.

The Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) project is funded by the Defense Advanced Research Projects Agency (DARPA). Entering Phase 2 of the project (Phases 0 and 1 are complete), IBM has built two state-of-the-art chips unlike anything produced before. These chips defy the traditional von Neumann architecture, which relies on predefined programs and sequential instructions. IBM will use these chips as the basis for an architecture with no set programming.

A peek inside IBM's brain lab in San Jose, CA:

SyNAPSE technical project manager Bill Risk next to the "brain wall." Each of the yellow boxes represents one of the cognitive computing chips (256 neurons), and close up you'll see them blinking - these are neurons firing.

IBM researchers Paul Merolla (left) and John Arthur having fun firing up a SyNAPSE demo.

10 things to know about SyNAPSE

1. The brain uses less power than a 25-watt light bulb and occupies less volume than a 2-liter bottle of soda, yet it completes complex tasks, autonomously computes what it needs to and when, and knows what information to save and for how long. The brain is the ultimate computer.

A cognitive computing system monitoring the world's oceans could contain a network of sensors and actuators that constantly record and report metrics such as temperature, pressure, wave height, acoustics and ocean tide, and issue tsunami warnings based on its decision making. Similarly, a grocer stocking store shelves could use an instrumented glove that monitors sights, smells, texture and temperature to flag bad or contaminated produce.

2. Today's computers use the von Neumann architecture, conceived more than 60 years ago. Without using more power and taking up more space, we simply can't program today's computers to handle the growing mountains of data we are faced with.

3. Cognitive computers emulate the brain’s abilities for sensation, perception, action, interaction and cognition, while integrating and analyzing vast amounts of data from many sources at once: in essence the "right brain" to today's "left brain" computers.

4. These systems won’t be programmed like traditional computers are today. Rather, cognitive computers will learn dynamically through experiences, find correlations, create hypotheses and remember – and learn from – the outcomes, emulating the human brain’s synaptic and structural plasticity (or the brain's ability to re-wire itself over time as it learns and responds to experiences and interactions with its environment.)

5. To build this new kind of system, IBM is combining neuroscience, nanoscience and supercomputing to rival the function, power and space efficiency of the brain.

6. Supercomputing: In November 2009, scientists used an IBM Blue Gene supercomputer to achieve significant advances in large-scale cortical simulation of a cat brain, substantiating the feasibility of a cognitive computing chip.

7. Neuroscience: Last year, scientists here at Almaden uncovered and successfully mapped the largest long-distance network of the monkey brain, which is essential for understanding the brain’s behavior, complexity, dynamics and computation. This discovery gives scientists unprecedented insight into how information travels and is stored across the brain.

8. Nanoscience: The revolutionary new chip we've unveiled is a building block toward the long-term goal of SyNAPSE: a chip system with ten billion neurons and a hundred trillion synapses that consumes merely one kilowatt of power and occupies less than two liters of volume.

9. Computers like this could have a significant impact on virtually every sector of the economy. The application and service possibilities will range from preventing fraud and providing better security, to helping scientists better understand intricate climate changes happening to our planet (see callout text).

10. IBM has assembled a world-class team that includes collaborators from Cornell University, Columbia University, the University of California, Merced and the University of Wisconsin-Madison, working with scientists from IBM Research sites in Austin, TX; Yorktown Heights, NY; India; and Zurich.
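Point 4's idea of synaptic plasticity, strengthening connections between neurons that fire together, can be illustrated with a toy model. The sketch below is plain Python, not IBM's actual chip design; the class, parameter values and update rule are all illustrative assumptions. It pairs a leaky integrate-and-fire neuron with a simple Hebbian weight update:

```python
class LIFNeuron:
    """Toy leaky integrate-and-fire neuron."""
    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold
        self.leak = leak
        self.potential = 0.0

    def step(self, input_current):
        """Integrate input with leakage; fire and reset when threshold is crossed."""
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after a spike
            return True
        return False

def hebbian_update(weight, pre_spiked, post_spiked, rate=0.1):
    """Strengthen a synapse only when pre- and post-neurons fire together."""
    if pre_spiked and post_spiked:
        weight += rate * (1.0 - weight)  # bounded potentiation
    return weight

# Drive a neuron with a constant input and watch it spike periodically.
neuron = LIFNeuron()
spikes = [neuron.step(0.3) for _ in range(20)]
```

With these parameters the neuron charges for three steps and fires on the fourth, a regular spike train; repeated co-firing would then push the synapse weight toward its upper bound.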


Building an inclusive society on the web

In an effort to improve the Web accessibility of public services for every Japanese citizen, including the elderly and disabled, the Ministry of Internal Affairs and Communications (MIC) developed Operational Models for Government Agencies and Municipalities: a manual that provides guidance and procedures for ensuring accessibility in tasks such as municipal orders outsourcing Web site development.

To make the manual consistent with the Japanese Industrial Standards (JIS) accessibility standard (JIS X 8341-3:2010) announced in August 2010, the MIC brought together experts across industry, government, and academia to revise the manual and, at the same time, develop a Web accessibility evaluation tool to complement it.


The free tool, called miChecker, offers user-friendly features that help government agencies and municipalities comply with JIS X 8341-3:2010. It is based on aDesigner, a Web accessibility assessment tool developed by IBM Research – Tokyo. Beyond checking compliance, aDesigner lets Web page owners, designers and developers identify and simulate the barriers that visually impaired users experience on their sites.
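To give a flavor of the kind of rule such tools verify, here is a minimal sketch using only the Python standard library. It is not miChecker or aDesigner, and the class name is invented; it checks one simplified rule, that every image carries an alt attribute, which screen readers for the visually impaired depend on:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flag <img> tags with no alt attribute at all (a simplified check)."""
    def __init__(self):
        super().__init__()
        self.problems = []  # src values of images missing alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if "alt" not in attr_map:
                self.problems.append(attr_map.get("src", "<unknown>"))

checker = AltTextChecker()
checker.feed('<img src="logo.png" alt="IBM logo"><img src="chart.png">')
print(checker.problems)  # → ['chart.png']
```

Real evaluators such as miChecker cover many more success criteria from WCAG 2.0 and JIS X 8341-3:2010 (contrast, headings, keyboard navigation, and so on); this shows only the general pattern of parsing a page and reporting violations.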

IBM contributed aDesigner to the Eclipse Foundation as part of the Accessibility Tools Framework (ACTF) – a collection of tools and building blocks developed by IBM.

As part of the MIC's initiative, IBM researchers made modifications to this open source evaluation tool to take into consideration new accessibility guidelines, such as Web Content Accessibility Guidelines 2.0 (WCAG 2.0) and JIS X 8341-3:2010.

In June this year, the MIC made miChecker available on its Japanese home page to help drive social inclusion and active social participation among Japan's citizens.

More information about other accessibility technology developed by IBM Research is available on the IBM Research Web site.


How IBM is digitizing the world's text

The idea of digitizing all books and making them available in electronic libraries can be traced back to 1945, when Dr. Vannevar Bush wrote "As We May Think" in the July issue of The Atlantic. His visionary description of an information-centric device called the "memex" influenced the development of the hypertext concept and the Internet. Projects in the 1970s, such as Michael Hart's Project Gutenberg and futurist Ray Kurzweil's Optical Character Recognition (OCR) technology, continued the effort toward the digitization of textual information. But while billions of people access the Internet today, full digitization and availability of past textual information is still a work in progress.

Among many current efforts underway, IBM is working with the European Union on project IMPACT (IMProving ACcess to Text) to efficiently produce digital replicas of historically significant texts and make them widely available, editable and searchable online. As part of the project, IBM researchers in Haifa, Israel, developed CONCERT (COoperative eNgine for Correction of ExtRacted Text), which automates simple, repetitious correction operations using an adaptive OCR engine that learns from its own text recognition errors.
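CONCERT's actual algorithms are not described here, but the feedback loop it relies on, an engine that improves as humans correct it, can be sketched in miniature. In this toy example (the class, method names and threshold are all invented for illustration), the corrector records word-level fixes and auto-applies a substitution only after it has been confirmed enough times:

```python
from collections import Counter

class AdaptiveCorrector:
    """Toy post-OCR corrector: learns word substitutions from human fixes."""
    def __init__(self, min_count=2):
        self.counts = Counter()      # (wrong, right) -> times confirmed
        self.min_count = min_count   # confirmations needed before auto-applying

    def learn(self, wrong, right):
        """Record one human correction."""
        self.counts[(wrong, right)] += 1

    def correct(self, word):
        """Apply the most frequently confirmed substitution, if trusted."""
        candidates = [(n, r) for (w, r), n in self.counts.items() if w == word]
        if candidates:
            n, best = max(candidates)
            if n >= self.min_count:
                return best
        return word

c = AdaptiveCorrector()
c.learn("tbe", "the")
c.learn("tbe", "the")
print([c.correct(w) for w in "tbe quick tbe".split()])  # → ['the', 'quick', 'the']
```

A production engine would of course work at the character-shape level and weigh context, but the principle is the same: each manual correction reduces the number of future corrections needed.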

Digitizing Japanese literature

The diverse nature of the Japanese language poses a serious challenge to digitizing the country's literature. Japanese script extends far beyond the few dozen standard characters typical of most other languages. In addition to the Japanese syllabary characters, hiragana and katakana, Japanese includes about 10,000 kanji characters (counting old characters, variants and the 2,136 commonly used characters), as well as ruby: small syllabary characters printed next to kanji as a reading aid. Texts also mix vertical and horizontal layouts.

The National Diet of Japan is Japan's bicameral legislature. It is composed of a lower house, called the House of Representatives, and an upper house, called the House of Councilors. Both houses of the Diet are directly elected under a parallel voting system. In addition to passing laws, the Diet is formally responsible for selecting the Prime Minister.

-- Wikipedia

Last year, IBM researchers in Tokyo combined their Social Accessibility tool with CONCERT to create a full-text digitization system prototype for the National Diet Library (NDL) of Japan. Dr. Makoto Nagao, the library's director, wrote the book "Digital Library" in 1994, in which he argued that digitizing books is only the first step toward an ideal electronic library; the next step is to create a system that allows users to take full advantage of the digitized information.

"The system needs to have capabilities that are close to how we hold and utilize knowledge in our brain," said Dr. Nagao.

In addition to helping Japanese Diet members perform their duties, the NDL preserves all materials published in Japan as part of the national cultural heritage and makes them available to government institutions and the general public. (As part of this effort, the NDL also launched the International Library of Children’s Literature in 2000.)

The NDL is also making recorded academic literature available online to the public, including making it accessible to the visually impaired, and lending the recordings to libraries throughout Japan.

The IBM Research – Tokyo team also developed a full-text digitization system prototype that improves the digitization of Japanese literature printed during and after the Meiji Period (1868 - 1912), improves reading accessibility for people with disabilities, and facilitates effective searching and viewing of full-text data. The prototype is also designed with an eye toward future international collaboration and standardization among libraries, including the digitization of historically significant literature, broad utilization of books for various academic activities, and online searching.

In a matter of years, all of our textual information could be fully digitized in a reusable way.