The evolution of CPU: The future of processors in the next 10 years

One thing is clear: the CPU of the future won't simply be a better version of today's chips; it will be something different. Technology moves fast, and if you think of a central processing unit today, you probably picture one of AMD's or Intel's creations.

The CPU has undergone many transformations to become what it is today. The first major challenge it faced dates back to the early 2000s, when the battle for performance was in full swing.

Back then, the main rivals were AMD and Intel, and their competition centered on raising clock speeds. For quite a while this was the straightforward path to better performance, but the laws of physics meant the rapid growth could not last.

According to Moore's Law, the number of transistors on a chip doubles roughly every 24 months. To fit more transistors, the transistors themselves had to shrink, which in principle meant better performance. However, the resulting rise in heat demanded ever more aggressive cooling, so the race for speed turned into a fight against the laws of physics.
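As a quick illustration of that doubling (the starting figure is hypothetical, not from the article), the snippet below projects transistor counts under a strict doubling every 24 months:

```python
# Illustrative only: project transistor counts under Moore's Law
# (doubling roughly every 24 months), starting from a hypothetical
# 42-million-transistor chip in the year 2000.
def projected_transistors(start_count: int, start_year: int, year: int,
                          doubling_period_years: float = 2.0) -> float:
    """Return the projected transistor count for a given year."""
    periods = (year - start_year) / doubling_period_years
    return start_count * 2 ** periods

for year in (2000, 2010, 2020):
    print(year, f"{projected_transistors(42_000_000, 2000, year):,.0f}")
```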

It didn't take long for a solution to appear. Instead of pushing clock speeds higher, manufacturers introduced multi-core chips, in which several cores run at similar, more modest clock speeds. Thanks to that, computers became far more effective at performing multiple tasks at the same time.

The strategy ultimately prevailed, but it had drawbacks too. To benefit from multiple cores, developers had to rework their algorithms to run in parallel, which was not always easy, particularly in the gaming industry, where CPU performance had long been one of the most important characteristics.

Another problem is that the more cores a chip has, the harder they are to coordinate, and writing code that makes good use of all of them is difficult. If it were possible to build, say, a 150 GHz single-core processor, much of that complexity would disappear, but silicon simply cannot be clocked that fast. The sketch below illustrates the kind of restructuring that multi-core hardware demands of software.
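A minimal sketch (not from the article) of that restructuring: the same workload written serially and then split across cores with Python's standard multiprocessing module.

```python
# The same CPU-bound workload, run serially and then spread across cores.
from multiprocessing import Pool

def heavy_task(n: int) -> int:
    """Stand-in for CPU-bound work: sum of squares up to n."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    workloads = [2_000_000] * 8

    # Serial version: one core does everything, one item after another.
    serial_results = [heavy_task(n) for n in workloads]

    # Parallel version: the work must first be broken into independent
    # chunks before it can be distributed across cores.
    with Pool(processes=4) as pool:
        parallel_results = pool.map(heavy_task, workloads)

    assert serial_results == parallel_results
```

The point of the example is that the parallel version only helps when the work can be cut into independent pieces, which is exactly the rewriting effort described above.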

The question of where processors go from here has been debated widely, in industry and academia alike. Below, we try to figure out the future of the chips ourselves.

Quantum Computing

Quantum computing is based on quantum physics and the behavior of subatomic particles. Machines built on this technology are very different from the ones in our homes. Conventional computers work with bits, while quantum machines use qubits. Two bits can hold only one of four combinations at a time: 00, 01, 10, or 11. Two qubits, thanks to superposition, can represent all four states at once, which is what allows quantum computers to work through an immense amount of data simultaneously.
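To make that difference concrete, here is a minimal state-vector sketch (a classical simulation for illustration only, not a real quantum computer) that puts two qubits into an equal superposition of 00, 01, 10 and 11:

```python
# Two qubits start in |00> and a Hadamard gate is applied to each,
# producing an equal superposition over all four basis states.
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)   # Hadamard gate

state = np.zeros(4)
state[0] = 1.0                          # |00>

# The two-qubit operator for "H on each qubit" is the tensor product H (x) H.
state = np.kron(H, H) @ state

for basis, amplitude in zip(("00", "01", "10", "11"), state):
    print(f"|{basis}>: amplitude {amplitude:.3f}, "
          f"probability {abs(amplitude) ** 2:.2f}")
```

Each of the four outcomes ends up with amplitude 0.5 and probability 0.25, i.e. the state genuinely contains all four combinations at once.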

There is one more concept worth knowing about: quantum entanglement. Entangled particles are created in pairs whose states remain linked, so that what happens to one is reflected in the other. Militaries have explored this property in attempts to replace conventional radar: one particle of an entangled pair is sent into the sky, and if it interacts with an object, its 'ground-based' counterpart reacts as well.

Quantum technology can also process immense amounts of information: for certain classes of problems, qubit-based machines can be dramatically faster than conventional computers. Forecasting and modeling complex scenarios are where quantum computers excel as well. They can model many environments and outcomes at once, and as such could be used extensively in physics, chemistry, pharmaceuticals, weather forecasting, and other fields.

However, there are drawbacks, too. Today's quantum computers are useful only for narrow, specialized purposes, largely because they require dedicated laboratory equipment and are extremely expensive to operate.

There is another issue connected with the development of the quantum computer: the top speed at which today's silicon chips can operate is much lower than what is needed to test quantum technologies properly.

Graphene Computers

Discovered in 2004, graphene gave rise to a new wave of research in electronics. The material has several properties that could make it the future of computing.

Firstly, it conducts heat better than any other conductor used in electronics, including copper. It can also carry electrical charge far faster than silicon; figures of up to two hundred times faster are often cited.

The top clock speed silicon-based chips can sustain is around 3-4 GHz, a figure that has barely changed since 2005, when the race for speed pushed the physical properties of silicon to their limit. Since then, scientists have been searching for a way past the ceiling silicon imposes, and the discovery of graphene offered one.

With graphene, researchers have reported switching speeds up to a thousand times higher than silicon chips can manage, and graphene-based designs are projected to consume as much as a hundred times less energy than their silicon counterparts. On top of that, they would allow for smaller devices with greater functionality.

Today, however, there is no working prototype of such a computing system; it still exists only on paper. Scientists are working to build a real model that could revolutionize the world of computing.

However, there is one drawback. Silicon is a good semiconductor: it can not only carry current but also block it, which is what lets a transistor hold a state. Graphene, on the other hand, is an exceptional conductor that carries electricity at extremely high speed but, lacking a band gap, cannot easily be switched off to retain a charge.

Binary logic requires transistors that switch on and off when we need them to, and that hold a signal so data can be saved for later use. It is vital, for example, that RAM chips keep their contents; otherwise, our programs would shut down the moment they opened.

Graphene fails to retain signals because it carries electricity so fast that there is almost no separation between the 'on' and 'off' states. That does not mean there is no place for graphene in computing: it could still be used to move data at top speed, and it could end up in chips when combined with other technologies. To make the idea of 'retaining a signal' concrete, the toy latch below shows how ordinary switching logic stores a bit.
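This is a generic logic example, not something from the article: an SR latch built from two cross-coupled NOR gates keeps holding a stored bit even after its inputs are released, which is exactly the behaviour a logic family needs if it is to store data.

```python
# Toy simulation of an SR latch (set-reset latch) made of two NOR gates.
def nor(a: int, b: int) -> int:
    return 0 if (a or b) else 1

def sr_latch_step(s, r, q, q_bar):
    """Advance the cross-coupled NOR gates a few times until they settle."""
    for _ in range(4):
        q, q_bar = nor(r, q_bar), nor(s, q)
    return q, q_bar

q, q_bar = 0, 1                             # latch currently storing 0
q, q_bar = sr_latch_step(1, 0, q, q_bar)    # pulse 'set': store a 1
print(q)                                    # -> 1
q, q_bar = sr_latch_step(0, 0, q, q_bar)    # inputs released...
print(q)                                    # -> 1 (the bit is retained)
```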

Apart from quantum and graphene technologies, there are other directions in which the CPU could develop, but none of them currently looks more realistic than these two.

AMD Acquires Xilinx for $35 Billion

AMD has confirmed that it is acquiring Xilinx, a maker of FPGA (field-programmable gate array) chips, in a deal worth $35 billion. Earlier rumors had put the price at around $30 billion.

FPGAs can be reconfigured after manufacture to carry out a wide variety of specialized tasks, which makes them highly sought after in the automotive industry, cellular base stations, and other vertical markets.

To analysts, the move means AMD is strengthening its artificial intelligence capability to keep up with competitor Nvidia in machine learning and inference.

According to Semico Research, Xilinx's machine learning and artificial intelligence capabilities will give AMD a real boost as it moves into AI and ML.

AMD will also be in a position to diversify into markets it has yet to enter and that would otherwise require massive investment, such as telecom, industrial, and automotive.

Both AMD and Xilinx have invested heavily in the data center market, and the acquisition puts AMD in a better position to compete with Nvidia there.

Better Leadership

The deal reflects the steady growth of AMD's stock, from around $2 five years ago to $78.88 today, giving the company a market value of roughly $100 billion.

By opting for an all-stock deal, the company avoids taking on debt that could potentially harm it down the road.

AMD is following the example of Intel, which acquired Altera, then Xilinx's top competitor, for $16.7 billion in 2015. That acquisition did not turn out well for Intel, reportedly due to leadership issues.

The coming together of Xilinx and AMD unites two companies whose strengths complement each other well.
The deal is the fruition of years of talks between them, and Xilinx's chief executive will remain on board after the merger.

Self-Erasing Chips could Enhance Security and Curb Counterfeits

Scientists from the University of Michigan are experimenting with self-erasing chips that make it easier to tell when an electronic device has been tampered with.

The self-erasing chips could flag whenever someone has tampered with a sensitive shipment along the way.

The chips are built from a new material that temporarily stores energy and shifts the color of light it emits while doing so. The stored record fades on its own within days, and it can also be wiped deliberately with blue light.

At the moment, it is difficult to tell whether counterfeiters have tampered with an electronic device. The device may still operate normally but will be providing a third party with information, according to Assistant Professor Parag Deotare.

A self-erasing bar code on a chip would let the owner know if someone has opened the device and, for example, installed a listening device. Bar codes on circuit boards and integrated circuits could provide evidence that items were not tampered with in transit.

The bar codes could also be made to last longer, so that they could be written into a device as, for example, software authorization keys.

The researchers laid a three-atom-thick semiconductor on top of a layer of molecules called azobenzenes, which shrink when exposed to ultraviolet light. The shrinking molecules pull on the semiconductor, making it emit light at longer wavelengths.

The message can only be read under a specific kind of light, which is why the researchers are also interested in the material as a medium for secret messages: a message self-destructs after a while, or it can be erased at will by illuminating it with blue light.

Once stretched, the azobenzene gradually dissipates its stored energy over about seven days as long as it stays in the dark; exposure to light shortens that period. An erased chip can then be used to record a different bar code or message.
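Purely as a hypothetical illustration (none of these numbers or formulas come from the study), one could model the fading of a stored bar code as an exponential decay that drops to an unreadable level after about a week in the dark and much sooner under light:

```python
# Hypothetical fading model: exponential decay of the stored signal,
# reaching ~5% of its initial value over an assumed lifetime.
import math

def signal_remaining(days: float, lifetime_days: float = 7.0) -> float:
    """Fraction of the stored signal left after `days`, assuming it decays
    to about 5% of its initial value over `lifetime_days`."""
    decay_rate = -math.log(0.05) / lifetime_days
    return math.exp(-decay_rate * days)

for day in range(0, 8):
    in_dark = signal_remaining(day)                       # assumed 7-day fade
    in_light = signal_remaining(day, lifetime_days=1.0)   # assumed faster fade
    print(f"day {day}: dark {in_dark:.2f}, lit {in_light:.2f}")
```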

The semiconductor behaves much like other nanomaterials, but it emits light at specific frequencies.

Jinsang Kim, a professor of materials science and engineering, designed the material together with Da Seul Yang, a doctoral student in macromolecular science and engineering. They applied the coating by floating a layer of the molecules on water and dipping a silicon wafer in, so that the wafer came out covered with the molecules.

Next, the researchers will work on a similar material that keeps a message intact for longer than a week, which would make it more useful for anti-counterfeiting measures.

The Air Force Office of Scientific Research is funding the research, and the University of Michigan is pursuing patents and commercial partners to take this technology to the market.

What is the future for printing?

From the humble printing press to 3D printers, this is an industry that has seen big change, and even in a digital world, printing isn't dead. According to Quocirca's Global Print 2025 study, 64% of businesses believe printing will remain important to their daily business even in 2025. So here's what we can look forward to in the future.

It will become more environmentally friendly

Millions of trees are used for paper and millions of cartridges are sent to landfill each year, but that trend is being reversed. We can expect printing to become much more environmentally friendly to keep up with the times. Whether that means using only recycled cartridges and paper or buying high-capacity XL ink cartridges that last longer, efficiency and green concerns will be at the forefront.

Eco modes will become standard on every model and will deliver ever-better eco-friendly performance.

Even the composition of ink is being reconsidered to help it last as long as possible, with new formulae created to stop ink drying up in the cartridge and going to waste.

3D printing will become more advanced

Printing has traditionally been a 2D affair, on paper, card, fabric and plastic. In recent years, however, 3D printing has entered the mainstream spotlight. 3D printers use materials in place of ink or toner, building up solid objects layer by layer.

The technology is now even being used to create organs.

  • Researchers at the University of Minnesota created a prototype 3D-printed bionic eye, and scientists in the UK have used stem cells to 3D print human corneas.
  • Researchers in the Netherlands have printed a tooth that can kill bacteria.
  • Researchers in Switzerland have created a 3D-printed silicone heart.

There is still plenty of room to grow, however. 3D-printed organs could transform medicine and enhance people's lives. Currently, the silicone heart can only manage about 3,000 beats; at an average rate of 80 beats per minute, that means the printed organ lasts only around 37.5 minutes. It is a short time, but it is progress: a foundation has been laid, and the future will probably see fully functioning organs coming off the printer.

Printing will become easier

Printing has already been made pretty easy. Once upon a time, it was impossible to print a document without your computer being physically tethered to the printer by a cable. Now printers have Wi-Fi, so you can hit print from a laptop, desktop, phone or tablet, wired or not. Some printers can even accept jobs when you're nowhere near them: you could be out shopping and email a document to your printer's designated address so it's waiting for you when you get home. In the future this may become the norm on all printers, making the whole process quicker and easier and taking today's cutting-edge functions mainstream.

AI could become an everyday feature

Artificial intelligence (AI) could play a huge part in the printing industry. In an office setting, for example, it could enhance security: printed materials could be scanned to automatically approve entry to buildings, or access to a printer could be restricted to employees with the correct permissions.

An 'intelligent' printer could also forecast when you are likely to run out of ink or toner, or when the printer will need servicing, and could even reorder supplies for you. A rough sketch of how such a forecast might work is shown below.
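This is a hypothetical sketch with made-up numbers, not any vendor's actual feature: the printer extrapolates its remaining ink from recent usage and works out an estimated empty date.

```python
# Estimate days until an ink cartridge runs out at the current usage rate.
from datetime import date, timedelta

def days_until_empty(ink_remaining_pct: float,
                     pages_per_day: float,
                     pct_used_per_page: float) -> float:
    """Days left before the cartridge is empty at the current usage rate."""
    daily_usage = pages_per_day * pct_used_per_page
    if daily_usage <= 0:
        return float("inf")
    return ink_remaining_pct / daily_usage

# Example: 40% ink left, ~25 pages a day, each page using ~0.2% of a cartridge.
days_left = days_until_empty(40.0, 25, 0.2)
print(f"Estimated empty date: {date.today() + timedelta(days=round(days_left))}")
# A printer with this forecast could trigger a reorder once days_left
# drops below the supplier's typical delivery time.
```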

Printing has already transformed and evolved enormously, and as technology keeps advancing, we can expect it to continue doing so. From the humble printing press to printing a heart, the industry is not dead yet.
