Google has made a $2 million donation to the Wikimedia Foundation, the group behind the widely used Wikipedia reference site.
Jimmy Wales, a member of the board who helped create Wikipedia, announced the donation in a tweet on Tuesday. He said a formal announcement would come Wednesday.
Google and Wikimedia did indeed make a formal announcement of the grant, saying the funds will go toward the technical infrastructure needed to handle growing traffic and to help make Wikipedia "easier to use and more accessible."
"Wikipedia is one of the greatest triumphs of the Internet. This vast repository of community-generated content is an invaluable resource to anyone who is online," Google co-founder Sergey Brin said in a statement.
Wikimedia is funded primarily by individual donations, the organization said. In its fund-raiser for 2009 and 2010, 240,000 people donated more than $8 million, about three quarters of Wikimedia's budget.
Mitch Kapor, a Wikimedia advisory board member, also announced the Google funding.
Wikipedia is a vast source of information posted online and edited by individuals rather than a central authority. Its pages emerge frequently in Google searches. Other Wikimedia projects include the Wikibooks book repository, the Wiktionary dictionary, and the Wikimedia Commons repository of images and other media files.
So far, more than 958 million editing changes have been made to Wikimedia projects, according to the organization's statistics counter.
Wednesday, 17 February 2010
HP earnings rise 25 percent to $2.3 billion
Hewlett-Packard has beaten expectations for its first fiscal quarter of 2010.
The company reported on Wednesday net revenue of $31.2 billion and earnings of $2.32 billion, or 96 cents per share. That's an 8 percent increase from revenue of $28.8 billion and a 25 percent increase from earnings of $1.86 billion, or 75 cents per share, in the same quarter a year ago.
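The year-over-year percentages quoted above can be checked directly from the dollar figures. A quick sketch (the amounts are from the article; the percentages are simply recomputed):

```python
# Sanity check of the year-over-year growth figures quoted in the article.

def pct_change(new, old):
    """Percentage change from old to new, rounded to the nearest percent."""
    return round(100 * (new - old) / old)

revenue_growth = pct_change(31.2, 28.8)   # $31.2B vs. $28.8B
earnings_growth = pct_change(2.32, 1.86)  # $2.32B vs. $1.86B

print(revenue_growth)   # 8 (percent)
print(earnings_growth)  # 25 (percent)
```

Both recomputed figures match the article's stated 8 percent and 25 percent.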
Wall Street was expecting earnings of $1.06 per share and revenue of $30.01 billion. Analysts who track HP had been anticipating strong gains in the PC and server businesses, which the company delivered. PC shipments grew 26 percent over the past year and HP maintained its lead as the world's largest PC vendor. Server revenue increased 11 percent to $4.4 billion.
In a statement released prior to the company's earnings call with analysts Wednesday afternoon, CEO Mark Hurd was upbeat about the quarter and the company's prospects for the rest of the year.
"HP is well-positioned to outperform the market," he said. "The strength of our portfolio, leaner cost structure and accelerating market momentum give us the confidence to raise our full-year outlook."
On a conference call with analysts, CFO Cathie Lesjak said that based on the strong results so far this year, HP now expects better results for the full year. The company projects full-year revenue of $121.5 billion to $122.5 billion, up from previous estimates of $118.0 billion to $119.0 billion, and earnings per share of $4.37 to $4.44. HP is considered a bellwether, and its decision to raise its outlook bodes well for the industry.
The particular strong point for HP this quarter was printers, a business that has long been the company's largest source of revenue but that had been hurting of late. The weak economy, combined with consumers' tendency to print less and share pictures and documents via e-mail and the Web, had been taking its toll on the division. But Hurd said that HP made fixing the printer unit a priority last year, and the results for the first quarter of 2010 showed improvement.
Consumer printer shipments grew 18 percent, while commercial printer shipments increased 11 percent from a year ago. Hurd said shipments of the company's relatively new wireless printers doubled, and ink product shipments tripled.
The PC market, which began to pick up at the end of 2009, should also continue to improve, Hurd said.
"We saw a pretty strong consumer demand. We expect that to continue in the first half of the year," he said. "We do expect a corporate refresh in the back half of the year."
Besides ink, printers, and PCs, Hurd said his company will focus efforts on growing its sales staff.
"HP's salesforce is 50 percent larger than when I joined the company and we want to increase that number," he said Wednesday.
He said that the company is currently hiring for its direct sales and channel partners in growing international markets.
HP stock rose 48 cents to $50.57 in after-hours trading.
Tuesday, 28 October 2008
Plasma (physics)
In physics and chemistry, plasma is a partially ionized gas in which a certain proportion of electrons are free rather than bound to an atom or molecule. The ability of the positive and negative charges to move somewhat independently makes the plasma electrically conductive, so that it responds strongly to electromagnetic fields. Plasma therefore has properties quite unlike those of solids, liquids, or gases and is considered a distinct state of matter. Plasma typically takes the form of neutral gas-like clouds, as seen, for example, in stars.

This state of matter was first identified in a Crookes tube and so described by Sir William Crookes in 1879 (he called it "radiant matter"). The nature of the Crookes tube "cathode ray" matter was subsequently identified by British physicist Sir J.J. Thomson in 1897, and the term "plasma" was coined by Irving Langmuir in 1928, perhaps because it reminded him of blood plasma. Langmuir wrote: "Except near the electrodes, where there are sheaths containing very few electrons, the ionized gas contains ions and electrons in about equal numbers so that the resultant space charge is very small. We shall use the name plasma to describe this region containing balanced charges of ions and electrons."

Although a plasma is loosely described as an electrically neutral medium of positive and negative particles, a more precise definition rests on three criteria:
1. The plasma approximation: Charged particles must be close enough together that each particle influences many nearby charged particles, rather than just interacting with the closest particle (these collective effects are a distinguishing feature of a plasma). The plasma approximation is valid when the number of charge carriers within the sphere of influence of a particular particle (called the Debye sphere, whose radius is the Debye screening length) is greater than unity, so that the charged particles behave collectively. The average number of particles in the Debye sphere is given by the plasma parameter, Λ (the Greek letter lambda).
2. Bulk interactions: The Debye screening length (defined above) is short compared to the physical size of the plasma. This criterion means that interactions in the bulk of the plasma are more important than those at its edges, where boundary effects may take place. When this criterion is satisfied, the plasma is quasineutral.
3. Plasma frequency: The electron plasma frequency (measuring plasma oscillations of the electrons) is large compared to the electron-neutral collision frequency (measuring frequency of collisions between electrons and neutral particles). When this condition is valid, electrostatic interactions dominate over the processes of ordinary gas kinetics.
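The three criteria become concrete with back-of-the-envelope numbers. The sketch below computes the Debye length, the plasma parameter Λ, and the electron plasma frequency from the standard textbook formulas; the density and temperature chosen are illustrative values for a tenuous laboratory discharge, not taken from the text.

```python
import math

# Physical constants (SI units)
EPS0 = 8.854e-12    # vacuum permittivity, F/m
E = 1.602e-19       # elementary charge, C
ME = 9.109e-31      # electron mass, kg

def debye_length(n_e, t_ev):
    """Debye screening length (m): sqrt(eps0 * kB*T / (n_e * e^2)).
    n_e is electron density in m^-3; t_ev is electron temperature in eV,
    so kB*T in joules is simply t_ev * E."""
    return math.sqrt(EPS0 * t_ev / (n_e * E))

def plasma_parameter(n_e, t_ev):
    """Average number of particles in the Debye sphere: Λ = (4/3)π n_e λ_D³."""
    return (4.0 / 3.0) * math.pi * n_e * debye_length(n_e, t_ev) ** 3

def plasma_frequency(n_e):
    """Electron plasma (angular) frequency: ω_pe = sqrt(n_e e² / (eps0 m_e))."""
    return math.sqrt(n_e * E ** 2 / (EPS0 * ME))

# Illustrative (assumed) values for a tenuous laboratory discharge:
n_e = 1e16   # electrons per cubic metre
t_ev = 1.0   # electron temperature, eV

lam_d = debye_length(n_e, t_ev)    # ~7e-5 m: tiny compared to the device
lam = plasma_parameter(n_e, t_ev)  # ~2e4 particles: >> 1, criterion 1 holds
w_pe = plasma_frequency(n_e)       # ~6e9 rad/s: very rapid oscillations
```

With Λ on the order of 10^4, collective behavior dominates, and the gigahertz-scale plasma frequency comfortably exceeds typical electron-neutral collision rates in such a discharge, satisfying criterion 3.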
Monday, 20 October 2008
Plasma nitriding
A plasma is the 'fourth' state of matter, the other three being the solid, liquid, and gaseous states. In the plasma state, matter exists in an excited, 'ionized' form, meaning electrons have been knocked out of the outermost orbit. A plasma is thus usually an ionized gas: a mixture of neutral as well as charged particles. There are hot plasmas, typified by the plasma jets used for metal cutting, welding, cladding, or spraying. There are also cold plasmas, usually generated inside vacuum chambers at low pressure. Here the high-temperature characteristics of the ionized gases are not used; rather, the electronic properties become more useful, and an ionized gas such as nitrogen becomes much more reactive in such a low-pressure regime. Surface treatment with ionized nitrogen therefore hardens metals by two mechanisms:
by diffusion, since the diffusivity of the excited nitrogen atoms is higher (treatment can be up to ten times faster than gas nitriding), forming the Diffusion Zone, where precipitation hardening is present; and
by thermo-chemical reactions, which yield a thin layer of very hard iron and alloy nitrides, named the Compound Layer or White Layer.
Steels and alloy steels in particular benefit from plasma nitriding. Its advantage lies in close control of the nitrided microstructure, allowing nitriding with or without compound layer formation. Not only is the performance of treated parts enhanced, but their working lifespan, strain limit, and fatigue strength are improved as well.
A plasma nitrided part is usually ready for use: it calls for no machining, polishing, or other post-nitriding operations. The process is thus user-friendly, saves energy since it is the fastest nitriding method, and causes little or no distortion. It was invented by Dr. Bernhardt Berghaus of Germany, who later settled in Zurich to escape persecution of his community by the Nazis in 1939. Only after his death in the late 1960s was the process acquired by the Klockner group and popularized the world over.
Plasma nitriding is often coupled with a PVD (physical vapor deposition) process and labelled Duplex Treatment, for greatly enhanced benefits. Many users prefer a plasma oxidation step at the last phase of processing, which generates a smooth jet-black oxide layer that is very resistant to both wear and corrosion.

Gas nitriding, also called diffusion nitriding or ammonia nitriding, has been around for nearly a century. It is a simple and inexpensive method for surface hardening of 'nitriding alloys', i.e. alloy steels with nitride-forming elements, and of other metals, e.g. aluminium, chromium, molybdenum, and titanium, to name but a few.
Liquid bath nitriding, or salt bath nitriding, on the other hand, is not a classical nitriding technique but rather a carbo-nitriding technique, since it employs rather toxic sodium cyanide salts. These salts break up at high processing temperatures, from 510 up to 610 degrees Celsius depending on various factors. The technique produces only a thin 'chemical compound zone', hardly 4 to 10 micrometres thick, with no underlying hardened sub-surface. In developed nations the use of such toxic salts is banned, while elsewhere the process thrives under slack or poorly enforced laws and regulations.
Gas nitriding consumes a lot of power, besides using ammonia, which is not user-friendly and can cause physiological problems. Treatment cycles can be as long as 7 to 9 days for deeper cases. Usually only a diffused case of a few hundred micrometres is produced, and the thermo-chemically formed compound layer is machined off. This is called the white layer, as it appears whitish under magnification with an optical microscope. Gas nitrided parts are therefore not ready for use once nitrided but must be machined or polished to remove this brittle layer. These limitations stem primarily from the use of ammonia (NH3): when ammonia dissociates at high temperature, it yields three times as much hydrogen gas as nitrogen. Though this stoichiometry (the ratio of nitrogen to hydrogen) is good for diffusion, it tends to produce a mixture of two phases of iron nitride. A monophase nitride can only be generated with reliable manipulation of the gas ratios, which used to be a severe limitation of gas nitriding; it has since been rectified by sophisticated gas flow controls coupled with automated or computerized controls.
Plasma nitriding can successfully suppress formation of the white layer or, if needed, form a monophase layer that is either epsilon or gamma prime in nature; the former is harder, and the latter has better tensile properties. The user thus has a choice: a diffused layer alone where no extraordinarily hard surface is called for, or a thin monophase top layer over the diffused layer as needed. This tailor-made nature of the nitrided layer is plasma nitriding's great strength and versatility.
Though gas nitriding techniques with improved, separate flow channels for nitrogen and hydrogen have been introduced, plasma nitriding is by nature faster due to activation of the main reactive species. It can be up to ten times faster than classical gas nitriding and produces much lower mechanical distortion thanks to its lower processing temperatures.
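The "up to ten times faster" figure can be related to case depth through the usual parabolic diffusion law, where depth scales as the square root of diffusivity times time. The sketch below uses arbitrary illustrative units (not measured values) to show the two equivalent readings of a tenfold effective diffusivity:

```python
import math

def case_depth(diffusivity, time):
    """Approximate diffusion case depth via the parabolic law x ≈ sqrt(D * t)."""
    return math.sqrt(diffusivity * time)

# Illustrative (assumed) values: plasma nitriding's activated nitrogen acts
# as if the effective diffusivity were 10x that of classical gas nitriding.
D_gas = 1.0              # arbitrary units
D_plasma = 10.0 * D_gas
t = 1.0                  # same treatment time for both processes

# Same time: the plasma-nitrided case is sqrt(10) ≈ 3.2x deeper.
ratio_depth = case_depth(D_plasma, t) / case_depth(D_gas, t)
print(round(ratio_depth, 2))  # 3.16

# Same target depth: plasma nitriding needs only one tenth of the time.
t_plasma = t * (D_gas / D_plasma)
print(t_plasma)  # 0.1
```

In other words, a tenfold diffusivity advantage shows up either as a roughly threefold deeper case in the same cycle, or as the same case depth in a tenth of the cycle time, which is the sense in which the process is "up to ten times faster."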
Automated or computerized controls render plasma nitriding a formidable surface treatment: basically inexpensive, yet very effective and easily reproducible. The main uses remain in the automotive sector, for parts such as crankshafts, cams and camshafts, gudgeon pins, connecting rods, and levers or actuators. Among tools, a wide variety of high-speed steel cutting tools, metal-forming tools made from high-carbon, high-chromium alloy steels, and a huge variety of dies and moulds are routinely plasma nitrided to make them last up to ten times longer.
Lately plasma nitriding has been combined with an additional PVD (physical vapour deposition) treatment to increase its efficiency further. New thin-film hard coatings such as chromium nitride, aluminium titanium nitride, and titanium carbo-nitride are often applied to a plasma nitrided metal article, usually in the same reactor vessel to avoid contamination and eliminate extra cleaning. These are called Duplex Treatments.
Plasma nitriding combined with an oxidation step results in a highly corrosion-resistant, jet-black glossy layer that is very popular with the automotive industry.
3D optical data storage
3D optical data storage is the term given to any form of optical data storage in which information can be recorded and/or read with three-dimensional resolution (as opposed to the two-dimensional resolution afforded, for example, by CD). This innovation has the potential to provide terabyte-level mass storage on DVD-sized disks. Data recording and readback are achieved by focusing lasers within the medium. However, because of the volumetric nature of the data structure, the laser light must travel through other data points before it reaches the point where reading or recording is desired. Therefore, some kind of nonlinearity is required to ensure that these other data points do not interfere with the addressing of the desired point.
No commercial product based on 3D optical data storage has yet arrived on the mass market, although several companies are actively developing the technology and predict that it will become available by 2010.

Current optical data storage media, such as the CD and DVD, store data as a series of reflective marks on an internal surface of a disc. To increase storage capacity, discs can hold two or even more of these data layers, but their number is severely limited, since the addressing laser interacts with every layer it passes through on the way to and from the addressed layer. These interactions cause noise that limits the technology to approximately 10 layers. 3D optical data storage methods circumvent this issue with addressing schemes in which only the specifically addressed voxel (volumetric pixel) interacts substantially with the addressing light. This necessarily involves nonlinear data reading and writing methods, in particular nonlinear optics.
3D optical data storage is related to (and competes with) holographic data storage. Traditional examples of holographic storage do not address in the third dimension, and are therefore not strictly "3D", but more recently 3D holographic storage has been realized by the use of microholograms. Layer-selection multilayer technology (where a multilayer disc has layers that can be individually activated e.g. electrically) is also closely related.
Schematic representation of a cross-section through a 3D optical storage disc (yellow) along a data track (orange marks). Four data layers are seen, with the laser currently addressing the third from the top. The laser passes through the first two layers and only interacts with the third, since here the light is at a high intensity.
As an example, a prototypical 3D optical data storage system may use a disc that looks much like a transparent DVD. The disc contains many layers of information, each at a different depth in the medium and each consisting of a DVD-like spiral track. To record information, a laser is brought to a focus at the particular depth that corresponds to a particular information layer. When the laser is turned on, it causes a photochemical change in the medium. As the disc spins and the read/write head moves along a radius, the layer is written just as a DVD-R is written. The depth of focus may then be changed and another, entirely different layer of information written. The distance between layers may be 5 to 100 micrometers, allowing more than 100 layers of information to be stored on a single disc.
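Rough capacity arithmetic shows why these layer counts matter. Assuming (purely for illustration) a DVD-like areal capacity of about 4.7 GB per layer, the 100-layer figure and a mid-range 10-micrometre layer spacing from the text give:

```python
# Back-of-the-envelope capacity estimate for a multilayer volumetric disc.
# Assumed (illustrative) numbers: DVD-like 4.7 GB per layer, plus the
# "100 layers" and "5 to 100 micrometer spacing" figures quoted above.
gb_per_layer = 4.7           # single-layer DVD capacity, GB
layers = 100
layer_spacing_um = 10.0      # micrometres between layers (assumed mid-range)

total_gb = round(gb_per_layer * layers, 1)
stack_mm = round(layers * layer_spacing_um / 1000.0, 2)

print(total_gb)   # 470.0 GB -- approaching terabyte level on one disc
print(stack_mm)   # 1.0 mm  -- the layer stack fits within a DVD-thickness disc
```

Even with conservative per-layer density, a hundred layers pushes a single disc toward the terabyte scale while the whole layer stack stays about a millimetre thick.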
In order to read the data back (in this example), a similar procedure is used, except that instead of causing a photochemical change the laser induces fluorescence, e.g. by using a lower laser power or a different laser wavelength. The intensity or wavelength of the fluorescence differs depending on whether the medium has been written at that point, so the data is read by measuring the emitted light.
The size of individual chromophore molecules or photoactive color centers is much smaller than the size of the laser focus (which is determined by the diffraction limit). The light therefore addresses a large number of molecules (possibly as many as 10^9) at any one time, so the medium acts as a homogeneous mass rather than a matrix structured by the positions of chromophores.

The origins of the field date back to the 1950s, when Yehuda Hirshberg developed the photochromic spiropyrans and suggested their use in data storage. In the 1970s, Valeri Barachevskii demonstrated that this photochromism could be produced by two-photon excitation, and at the end of the 1980s Peter T. Rentzepis showed that this could lead to three-dimensional data storage. This proof-of-concept system stimulated a great deal of research and development, and in the following decades many academic and commercial groups have worked on 3D optical data storage products and technologies. Most of the developed systems are based to some extent on Rentzepis's original ideas. A wide range of physical phenomena for data reading and recording have been investigated, large numbers of chemical systems for the medium have been developed and evaluated, and extensive work has been carried out on the optical systems required for reading and recording data. Several groups continue to work on solutions at various levels of development and interest in commercialization (see below).

Although there are many nonlinear optical phenomena, only multiphoton absorption is capable of injecting into the medium the significant energy required to electronically excite molecular species and cause chemical reactions. Two-photon absorption is by far the strongest multiphoton absorption, but it is still a very weak phenomenon, leading to low media sensitivity.
Much research has therefore been directed at providing chromophores with high two-photon absorption cross-sections.

Writing by two-photon absorption can be achieved by focusing the writing laser on the point where the photochemical writing process is required. The wavelength of the writing laser is chosen such that it is not linearly absorbed by the medium, so it does not interact with the medium except at the focal point, where two-photon absorption becomes significant because it is a nonlinear process dependent on the square of the laser fluence.

Writing by two-photon absorption can also be achieved by the action of two lasers in coincidence, a method typically used to write information in parallel. One laser passes through the medium, defining a line or plane; the second laser is then directed at the points on that line or plane where writing is desired. The coincidence of the lasers at these points excites two-photon absorption, leading to writing photochemistry.

In microholography, focused beams of light are used to record submicrometre-sized holograms in a photorefractive material, usually by the use of collinear beams. The writing process may use the same kinds of media used in other types of holographic data storage, and may use two-photon processes to form the holograms.
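The confinement that makes two-photon writing work comes from that quadratic dependence: away from the focus, the writing rate falls off as the square of the intensity, much faster than linear absorption would. A minimal numeric sketch (arbitrary units; the "intensity halves away from focus" figure is an assumption for illustration):

```python
# Two-photon absorption scales as intensity squared, so photochemistry
# collapses rapidly away from the focal point, confining writing to the
# addressed voxel. Linear (one-photon) absorption has no such confinement.
def relative_rate(intensity, photons=2):
    """Absorption rate relative to the focus, where intensity is 1.0."""
    return intensity ** photons

out_of_focus = 0.5  # assumed: intensity roughly halves away from the focus

print(relative_rate(out_of_focus, photons=1))  # 0.5  -- linear absorption
print(relative_rate(out_of_focus, photons=2))  # 0.25 -- two-photon writing
```

Where a linear process would still write at half rate, the two-photon process drops to a quarter, and the gap widens further from the focus; this is why unaddressed layers along the beam path stay effectively untouched.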
No commercial product based on 3D optical data storage has yet arrived on the mass market, although several companies are actively developing the technology and predict that it will become available by 2010.Current optical data storage media, such as the CD and DVD store data as a series of reflective marks on an internal surface of a disc. In order to increase storage capacity, it is possible for discs to hold two or even more of these data layers, but their number is severely limited since the addressing laser interacts with every layer that it passes through on the way to and from the addressed layer. These interactions cause noise that limits the technology to approximately 10 layers. 3D optical data storage methods circumvent this issue by using addressing methods where only the specifically addressed voxel (volumetric pixel) interacts substantially with the addressing light. This necessarily involves nonlinear data reading and writing methods, in particular nonlinear optics.
3D optical data storage is related to (and competes with) holographic data storage. Traditional examples of holographic storage do not address in the third dimension, and are therefore not strictly "3D", but more recently 3D holographic storage has been realized by the use of microholograms. Layer-selection multilayer technology (where a multilayer disc has layers that can be individually activated e.g. electrically) is also closely related.
Schematic representation of a cross-section through a 3D optical storage disc (yellow) along a data track (orange marks). Four data layers are seen, with the laser currently addressing the third from the top. The laser passes through the first two layers and only interacts with the third, since here the light is at a high intensity.
As an example, a prototypical 3D optical data storage system may use a disk that looks much like a transparent DVD. The disc contains many layers of information, each at a different depth in the media and each consisting of a DVD-like spiral track. In order to record information on the disc, a laser is brought to a focus at a particular depth in the media that corresponds to a particular information layer. When the laser is turned on it causes a photochemical change in the media. As the disc spins and the read/write head moves along a radius, the layer is written just as a DVD-R is written. The depth of the focus may then be changed and another entirely different layer of information written. The distance between layers may be 5 to 100 micrometers, allowing >100 layers of information to be stored on a single disc.
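The layer-spacing figures above translate directly into a capacity estimate. A minimal back-of-envelope sketch in Python, assuming an illustrative usable recording depth of 0.6 mm and a DVD-like 4.7 GB per layer (both are assumptions for illustration, not figures from the text):

```python
# Rough capacity estimate for a multilayer 3D optical disc.
# Assumed: ~0.6 mm usable recording depth, ~4.7 GB per layer
# (DVD-like); neither figure comes from a specific product.
usable_depth_um = 600
per_layer_gb = 4.7

for spacing_um in (5, 20, 100):   # the 5-100 um range quoted above
    layers = usable_depth_um // spacing_um
    print(f"{spacing_um:>3} um spacing -> {layers:>3} layers, "
          f"~{layers * per_layer_gb:.0f} GB")
```

At the 5 µm end of the quoted range this gives over 100 layers, consistent with the >100-layer claim.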
In order to read the data back (in this example), a similar procedure is used, except that this time the laser causes fluorescence instead of a photochemical change in the media. This is achieved, e.g., by using a lower laser power or a different laser wavelength. The intensity or wavelength of the fluorescence differs depending on whether the media has been written at that point, and so by measuring the emitted light the data is read.
The size of individual chromophore molecules or photoactive color centers is much smaller than the size of the laser focus (which is determined by the diffraction limit). The light therefore addresses a large number (possibly even 10⁹) of molecules at any one time, so the medium acts as a homogeneous mass rather than a matrix structured by the positions of chromophores.

The origins of the field date back to the 1950s, when Yehuda Hirshberg developed the photochromic spiropyrans and suggested their use in data storage. In the 1970s, Valeri Barachevskii demonstrated that this photochromism could be produced by two-photon excitation, and finally at the end of the 1980s Peter T. Rentzepis showed that this could lead to three-dimensional data storage. This proof-of-concept system stimulated a great deal of research and development, and in the following decades many academic and commercial groups have worked on 3D optical data storage products and technologies. Most of the developed systems are based to some extent on the original ideas of Rentzepis. A wide range of physical phenomena for data reading and recording have been investigated, large numbers of chemical systems for the medium have been developed and evaluated, and extensive work has been carried out in solving the problems associated with the optical systems required for the reading and recording of data. Currently, several groups continue to work on solutions with various levels of development and interest in commercialization (see below).

Although there are many nonlinear optical phenomena, only multiphoton absorption is capable of injecting into the media the significant energy required to electronically excite molecular species and cause chemical reactions. Two-photon absorption is the strongest multiphoton absorbance by far, but it is still a very weak phenomenon, leading to low media sensitivity.
Therefore, much research has been directed at providing chromophores with high two-photon absorption cross-sections.

Writing by two-photon absorption can be achieved by focusing the writing laser on the point where the photochemical writing process is required. The wavelength of the writing laser is chosen such that it is not linearly absorbed by the medium, so it does not interact with the medium except at the focal point. At the focal point two-photon absorption becomes significant, because it is a nonlinear process dependent on the square of the laser fluence.

Writing by two-photon absorption can also be achieved by the action of two lasers in coincidence, a method typically used to write information in parallel. One laser passes through the medium, defining a line or plane. The second laser is then directed at the points on that line or plane where writing is desired. The coincidence of the lasers at these points excites two-photon absorption, leading to writing photochemistry.

In microholography, focused beams of light are used to record submicrometre-sized holograms in a photorefractive material, usually by the use of collinear beams. The writing process may use the same kinds of media that are used in other types of holographic data storage, and may use two-photon processes to form the holograms.
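The focal confinement described above follows from the square-law dependence: for a focused Gaussian beam the power crossing each slice is constant, so per-slice single-photon absorption is the same at every depth, while per-slice two-photon absorption scales as 1/w(z)² and peaks sharply at the waist. A small numerical sketch, with the waist and wavelength chosen purely for illustration:

```python
# Why two-photon writing is confined to the focal plane:
# per-slice 1-photon absorption ~ integral of I over the slice,
# which is constant in z (power is conserved), while per-slice
# 2-photon absorption ~ integral of I^2 ~ 1/w(z)^2.
import math

w0 = 0.5           # beam waist at focus, micrometers (assumed)
wavelength = 0.65  # micrometers (assumed)
z_R = math.pi * w0**2 / wavelength   # Rayleigh range

for z in (0.0, z_R, 5 * z_R):        # distance from the focus
    w = w0 * math.sqrt(1 + (z / z_R) ** 2)  # Gaussian beam radius
    one_photon = 1.0                 # normalized, depth-independent
    two_photon = (w0 / w) ** 2       # normalized to the focus
    print(f"z = {z / z_R:3.0f} z_R: 1-photon {one_photon:.2f}, "
          f"2-photon {two_photon:.3f}")
```

Two-photon response falls to half at one Rayleigh range and to a few percent at five, which is exactly the depth selectivity the writing scheme relies on.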
Ultra Density Optical
Ultra Density Optical (UDO) is an optical disc format designed for high-density storage of high-definition video and data. An Ultra Density Optical disc is a 133.35 mm (5.25") ISO-cartridge optical disc that can store up to 60 GB of data. Utilising a design based on the magneto-optical disc, but combining phase-change technology with a blue-violet laser, a UDO disc can store substantially more data than a magneto-optical (MO) disc because of the shorter wavelength (405 nm) of the laser employed; MO discs use a 650 nm red laser. Because a blue-violet laser has a narrower beam than a red one when burning to a disc, it allows more information to be stored digitally in the same amount of space.
Current generations of UDO store up to 60 GB, and a 120 GB version is in development and expected to arrive in 2007 or later, though up to 500 GB has been speculated as a possibility for UDO. According to Plasmon, desktop UDO drives are priced at around US$3,200, and a 30 GB UDO Write Once disc at US$60.

Originally developed as a replacement for the magneto-optical storage medium, Ultra Density Optical was developed beginning June 2000 and first announced by Sony on November 1, 2000.[3] It was later adopted with heavy investment by Plasmon, a UK technology company with extensive experience in computer archival backup systems and solutions. Currently UDO is being championed by its development partners: Plasmon, Hewlett-Packard, Asahi Pentax (responsible for the opto-mechanical assembly design), Mitsubishi Chemical (parent company of the Verbatim media storage brand and the second major development partner for UDO media), and various computer and IT solutions companies.

UDO uses a phase-change recording process that permanently alters the molecular structure of the disc surface. There are three versions of UDO 30: True WORM (Write Once Read Many), R/W (Rewritable), and Compliant WORM (shreddable WORM). The UDO Rewritable format uses a specially formulated phase-change recording surface that allows recorded data to be deleted and modified. In practice, UDO Rewritable media operates like a standard magnetic disc: files can be written, erased, and rewritten, dynamically reallocating disc capacity. Rewritable media is typically used in archive applications where the stability and longevity of optical media is important, but the archive records change on a relatively frequent or discretionary basis.
UDO systems use a blue-violet laser operating at a wavelength of 405 nm, similar to the one used in Blu-ray, to read and write data. Conventional MO systems use red lasers at 660 nm.[6]
The blue-violet laser's shorter wavelength makes it possible to store more information on a 13 cm UDO disc. The minimum "spot size" to which a laser can be focused is limited by diffraction and depends on the wavelength of the light and the numerical aperture of the lens used to focus it. By decreasing the wavelength and using a higher numerical aperture (0.85, compared with 0.575 for MO), the laser beam can be focused much more tightly. This produces a smaller spot on the disc than in existing MO systems and allows more information to be physically stored in the same area. The opto-mechanism design of current Plasmon UDO drives was jointly developed with Asahi Pentax.
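The scaling in this paragraph can be checked numerically. Taking the spot scale as proportional to λ/NA and areal density as its inverse square (a standard diffraction-limit approximation, here using the 650 nm red-laser figure quoted earlier for MO):

```python
# Diffraction-limited spot scale ~ wavelength / NA; areal density
# scales roughly as its inverse square. Figures from the text:
# MO: 650 nm, NA 0.575; UDO: 405 nm, NA 0.85.
def spot_nm(wavelength_nm, na):
    return wavelength_nm / na   # proportionality; constant dropped

mo = spot_nm(650, 0.575)
udo = spot_nm(405, 0.85)
print(f"MO spot scale  ~{mo:.0f} nm")
print(f"UDO spot scale ~{udo:.0f} nm")
print(f"areal density gain ~{(mo / udo) ** 2:.1f}x")
```

The combination of shorter wavelength and higher aperture shrinks the spot by a factor of about 2.4, i.e. roughly 5–6 times more bits in the same area, which is why the same cartridge size jumps from MO capacities to 30–60 GB.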
Holographic Versatile Disc
The Holographic Versatile Disc (HVD) is an optical disc technology that would hold up to 3.9 terabytes (TB) of information. It employs a technique known as collinear holography, whereby two lasers, one red and one green, are collimated in a single beam. The green laser reads data encoded as laser interference fringes from a holographic layer near the top of the disc while the red laser is used as the reference beam and to read servo information from a regular CD-style aluminum layer near the bottom. Servo information is used to monitor the position of the read head over the disc, similar to the head, track, and sector information on a conventional hard disk drive. On a CD or DVD this servo information is interspersed amongst the data.
A dichroic mirror layer between the holographic data and the servo data reflects the green laser while letting the red laser pass through. This prevents interference from refraction of the green laser off the servo data pits and is an advance over past holographic storage media, which either experienced too much interference or lacked the servo data entirely, making them incompatible with current CD and DVD drive technology.

These discs can hold up to 3.9 terabytes (TB) of information, which is approximately 5,800 times the capacity of a CD-ROM, 850 times the capacity of a DVD, 160 times the capacity of a single-layer Blu-ray Disc, and about twice the capacity of the largest computer hard drives as of August 2008. The HVD also has a transfer rate of 1 Gbit/s (125 MB/s). Optware was expected to release a 200 GB disc in early June 2006, and Maxell in September 2006 with a capacity of 300 GB and a transfer rate of 20 MB/s. On June 28, 2007, HVD standards were approved and published.

Current optical storage saves one bit per pulse, and the HVD alliance hopes to improve this efficiency to around 60,000 bits per pulse, stored in an inverted, truncated cone shape that has a 200 micrometer diameter at the bottom and a 500 micrometer diameter at the top. High densities are possible by moving these cones closer together on the tracks: 100 GB at 18 micrometer separation, 200 GB at 13 micrometers, 500 GB at 8 micrometers, and a demonstrated maximum of 3.9 TB at 3 micrometer separation on a 12 cm disc.
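The capacity multiples quoted above are easy to sanity-check against nominal media sizes. The sketch below assumes CD-ROM ≈ 0.7 GB, DVD ≈ 4.7 GB, and single-layer Blu-ray = 25 GB (these nominal sizes are this sketch's assumptions, not figures from the text):

```python
# Check the quoted capacity ratios for a 3.9 TB HVD against
# assumed nominal media sizes (GB).
hvd_gb = 3900
for name, gb in (("CD-ROM", 0.7), ("DVD", 4.7), ("Blu-ray SL", 25)):
    print(f"{name:10s}: ~{hvd_gb / gb:,.0f}x")

# And the transfer rate: 1 Gbit/s expressed in bytes per second.
print(f"1 Gbit/s = {1000 / 8:.0f} MB/s")
```

The results land close to the quoted multiples (the exact CD-ROM ratio depends on whether a 650 MB or 700 MB disc is assumed), and 1 Gbit/s does indeed come out to 125 MB/s.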
The system uses a green laser with an output power of 1 watt, a high power for a consumer-device laser. A major challenge in reaching widespread consumer markets is therefore either to improve the sensitivity of the polymer used, or to develop and commoditize a laser capable of higher power output that is still suitable for a consumer unit.

HVD is not the only technology in high-capacity optical storage media. InPhase Technologies is developing a rival holographic format called Tapestry Media, which it claims will eventually store 1.6 TB with a data transfer rate of 120 MB/s, and several companies are developing TB-level discs based on 3D optical data storage technology. Such large optical storage capacities compete favorably with the Blu-ray Disc format. However, holographic drives are projected to cost around US$15,000 initially, and a single disc around US$120–180, although prices are expected to fall steadily. The initial market for this format is not the common consumer, but enterprises with very large storage needs.