This year, the press is being given more attention at the Californian IDF than before. The amount of information keeps growing, so it is increasingly difficult to keep track of it all, especially given its considerable weight both for science and for the regular consumer. We have no choice but to adapt. This material covers four reports made at a press briefing the day before the forum opened. The amount of material is so great that it is difficult to describe every technology in detail and in simple terms within a single review, but that is the way we will go: we will try to avoid complicated technical details and explanations of the underlying physics, otherwise we would end up writing a whole book of interest only to a narrow circle of specialists.

## Reaching down to... molecules (nanotechnologies)

Back in 2003, Intel announced it was ready to produce chips on the 90-nm process technology, which now prevails on the market. In other words, everything has been measured in such values for a long time now. To give a clearer idea of the scale, we include a slide where you can see the difference in dimensions and judge the physical barrier facing the technology. The yellow frame marks the current level of technology, and it can be moved lower still this year, as shown on a slide later in the text. Note that little remains before the physical limit of a planar transistor is reached; by 2011 it will be possible to come right up to it, with Intel's process codenamed P1270. It is high time to ask a simple question: why shrink the process technology and the transistors at all? Smaller dimensions mean lower power consumption, lower leakage current (and hence less heating), and a higher transistor switching speed.
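As a rough illustration of why the node numbers fall the way they do (a back-of-the-envelope sketch of our own, not Intel's figures), each process generation traditionally scales linear dimensions by about 0.7x, which roughly halves transistor area:

```python
# Back-of-the-envelope sketch of process-node scaling.
# Assumption (ours, not Intel's): each generation shrinks linear
# dimensions by ~0.7x, so transistor area roughly halves.

SCALE = 0.7  # typical linear shrink per generation

def next_nodes(start_nm: float, generations: int) -> list:
    """Project feature sizes for the next few generations."""
    nodes = [start_nm]
    for _ in range(generations):
        nodes.append(nodes[-1] * SCALE)
    return nodes

nodes = next_nodes(90, 4)
print([round(n) for n in nodes])
# -> [90, 63, 44, 31, 22] -- close to the real 90/65/45/32/22 nm cadence

print([round(100 * SCALE ** (2 * i)) for i in range(5)])
# -> [100, 49, 24, 12, 6] -- relative transistor area, in percent
```

The halving of area per generation is what lets transistor counts keep doubling on a fixed die budget, which is the practical content of Moore's Law mentioned below.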
But a mere decrease in physical dimensions will not achieve that on its own. As dimensions shrink, the typical materials transistors are made of may change their properties. For example, a gate dielectric that works fine at the 65-nm process technology will start passing current on the transition to 45 nm. The only way out is to look for a new material. The next slide is already familiar to many: it was first shown two years ago, but it is still topical today. It illustrates the immense work of finding new materials for the transition to new process technologies. For quite understandable reasons, not all the chemical compounds can be named, so the table uses the code names "High-k" and "Metal". Here is another interesting slide showing the progress of development. What should nanotransistors be interconnected with? With nanoconductors. At a conductor diameter of 20 nm, they can no longer be created by the "traditional" lithographic technique. But this problem has been solved too: using carbon for the conductors inside the chip, together with a special production process, made it possible to attain conductors just 1 nm in diameter. Time to sum up a bit. According to Paolo Gargini, today's semiconductor technology will live for no more than 15-20 years. During that time it will evolve substantially; the current bets are on multi-gate transistors and on combining other technologies (sensor, optical, biological, etc.) on a single chip. It is still hard to say what will follow. One suggestion is a transition to molecular technologies. Be that as it may, the promise is that Moore's Law will outlive semiconductor technology, or at least hold for the next 15-20 years.

## Heading for platforms

Since the emergence of Centrino, Intel has been thinking more actively at the scale of the whole platform rather than its specific components.
That became especially evident when everyone started talking, almost unanimously, about the convergence of computing and communications. Let us put it straight: by the notion of a "platform", the well-coordinated (!) operation of the main components of a computing system is implied. In his report, Mr. Spindler introduced the journalists to several new technologies that will be available in the near future. The most interesting is Intel Active Management Technology: a hardware-software structure that allows systems to be remotely diagnosed, restored and controlled regardless of whether the operating system is loaded. This technology will certainly appeal to system administrators and service desks, and in general it should reduce the time spent servicing equipment. The not unknown Vanderpool Technology also looks promising. It allows several virtual systems to be set up on a single physical machine, each with its own copy of the operating system, its own running applications, and so on. This should substantially improve system stability and scalability. By the way, the question of MS Windows licenses and how many of them are required has come up again; it remains unresolved. On the whole, the technology is aimed at server applications and will be a complex of hardware and software. Both technologies will reach the market this year. As for the remaining technologies, we include a conventional roadmap of what to expect this year.

## Saving on batteries

However hard Mr. Thakkar tried to tell as much as possible about the new mobile platform, the conversation kept coming back to energy saving in each component, which in fact was not contrary to the topic of the report. The platform is already well known, and we have written about it many times. Having outlined the four vectors of progress (wireless communication, long battery life, performance and a compact form factor), let us go deeper into power consumption.
This is how Intel sees the breakdown of energy consumption in a modern notebook PC. The most active consumer is the screen, and Intel's developers have found ways to address it as well. Intel Display Power Savings Technology 2.0 reduces the screen's energy consumption by around 15%. The solution lies in dynamic control of the backlight brightness: the video chip continuously analyzes the image and, depending on its content, can dim the backlight accordingly. The chipset has also become more power-efficient: it saves on the graphics controller and the buses, and, most interestingly, the optimization has reached the audio subsystem as well. It handles sleep states better and gives substantial savings in DVD playback mode.

Read next: "IDF 2005, Day One: tuned Chrysler, flight to the stratosphere, and 15 multicore projects".
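The dynamic-backlight idea described above can be sketched in a few lines. This is a toy model of our own, not Intel's actual DPST algorithm: measure the frame's peak luminance, lower the backlight just enough to cover it, and scale the pixel values up to compensate so perceived brightness stays the same.

```python
# Toy model of dynamic backlight control (our own sketch, not
# Intel's DPST algorithm): dim the backlight to the frame's peak
# luminance and scale pixels up so perceived brightness is kept.

def adapt_backlight(frame, floor=0.25):
    """frame: pixel luminances in 0.0..1.0.
    Returns (backlight_level, compensated_frame)."""
    peak = max(frame) if frame else 1.0
    backlight = max(peak, floor)       # never dim below a floor
    # Compensate: pixel * backlight reproduces original luminance.
    compensated = [p / backlight for p in frame]
    return backlight, compensated

# A dark scene: the backlight can drop to 50%, saving roughly half
# the backlight power, while on-screen luminance is unchanged.
level, pixels = adapt_backlight([0.1, 0.3, 0.5])
print(level)    # -> 0.5
print(pixels)   # -> [0.2, 0.6, 1.0]
```

Since the backlight dominates a panel's power budget and bright pixels are rare in typical content, even this crude per-frame scheme hints at where the claimed savings come from.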