MythBusters Chapter 2- Do computers get damaged when not shut down for a week?

MythBusters is a new series by The 8-Bit, where we'll bring you myths that exist in the tech space and try to crack 'em.


After almost a week of wrestling against the tortures of time, Siddharth Ahuja, an author at The 8-Bit, managed to complete the daunting paper assigned to him by his teacher. Just as he was about to save his work, his three-month-old HP laptop crashed.

It wouldn't turn on. Apparently, it was time for the computer to meet its death.

Sid, visibly anxious, turned to his friend in despair. He felt as if the whole world had crushed him under his own deeds. There was no hope. No one could, or did, save him from the dire consequences. But there was one thing he vowed from then on: he would never again leave his computers turned on for weeks on end.

It turns out that humans tend to think of the things around them as fragile and mortal, just like themselves. But that's not usually the case. Machines, unlike humans, are capable of laboring 24/7 with little need for maintenance or rest. Sure, it's unfair to compare humans to machines outright, but consider this thought for a moment: humans need to shut themselves off at night to see the sun the next day, but expecting a machine, which is created to ease human life, to stop doing its work, even for a short span, is misguided.

This thought bears directly on Siddharth's unfortunate episode. Later, on diagnosis, it was found that his computer crashed because of a virus infection that had corrupted the hard drive. Thanks to some data recovery software, he was able to recover the paper he had worked his butt off for. By then, though, it was too late.

People like Siddharth are a clear example of how little most of us know about the latest improvements in the ever-expanding world of technology. Even though the virus on his PC wasn't entirely his fault, blaming the technology without understanding how it works was a blunder. Such "myths" live on in millions of people's minds; myths that can limit what a person can achieve with one of humanity's greatest gifts to itself: the computer.

Before we get to the reasons why this myth no longer holds up, let's get to the roots of it.

How did computers come into existence and what did they pack back then?

A computer can be defined as a device that can be instructed to carry out arithmetic or logical operations automatically via programming. Even though that's the oldest way you could define a computer, it still holds largely true today. Charles Babbage, a visionary and inventor, is credited with designing the first "mechanical" computer in the early 19th century. He kept advancing in this direction by designing further variations of his machines. But, owing to funding problems and the politics of the time, he was never really able to produce a computer that resembled anything close to what we have today.

However, the world later saw another figure who practically saved computers from vanishing into the deep abyss we call undocumented history. His name was Alan Turing, and he was an all-rounder when it came to science. His most celebrated work, though, was in the field of computation, and it paved the way for the advances that produced the computer we have today.

Early computers (I'm talking about the 1990s and earlier) were more mechanical than software-driven. Basically, everything from the initial boot-up to shut-down involved some moving mechanism. The floppy disk was the go-to storage device, for many people a mouse was still just a small animal that spread illnesses, and the monitor worked when you hit it hard on the top.

Software itself, at the time, wasn't very developed. It existed mainly so that users could perform basic functions. There was little optimization. A better way to describe it: the software and the hardware didn't really have any connection beyond getting basic tasks done. Then came the hard drives…

Early Computers

The first hard drive to cross the gigabyte mark came from IBM and could store about 2.5 GB. But there was a catch: it weighed around 550 pounds and was the size of a refrigerator. Then came laser discs, which stored data optically, but they too were far from compact enough for everyday use. Floppy disks and magnetic tapes followed.

Then the world of computers witnessed a revolutionary change: compact hard drives. These could store more data, last longer, and came in a form factor suitable for everyday use. To date, many of us still use a hard drive; they are pretty common.

Old technology pitted against the new: Where does the myth stand?

I've already mentioned what computers rocked in the '90s. However, there's no denying that we are in the middle of another revolution, where computers no longer have to be explicitly programmed for every task; they can learn from data with the help of machine learning and neural networks. But does that support the claim that keeping computers turned on at all times is normal? Well, it's a bit more complicated than it seems.

A few years ago, we depended almost entirely on regular hard drives for our digital storage needs. A regular hard drive has a spinning platter inside, and data is read and written as it rotates. But it's mechanical, and history, along with science, shows that anything mechanical will someday wear out. That is how hard drives fueled what has now become a misconception: that computers wear out if they are kept on for too long.

Later, when SSDs (Solid State Drives) arrived on the market, they changed how we used computers. Having no moving parts, SSDs transformed what was once a fact, that computers needed to be shut down regularly, into a modern-day myth. Sure, other factors are also responsible, but storage has been a primary concern for decades, and it still is.

What’s the importance of shutting down a computer?

As a matter of fact, whether you shut down a computer or leave it turned on doesn't matter much today. But, at the same time, I feel you should know why people once thought shutting down their computers was the right thing to do.

Cooling mechanisms

Back then, the only thing that helped a computer cool down was the fan mounted on the CPU. There was no adequate software optimization to ensure that all the components ran as efficiently as they should.

Using Photoshop demanded a lot of graphics power and RAM, and the components heated up and needed to cool down afterwards. Most of the time, the fans did the job. But if you left your computer on overnight, waiting for a torrent to download, it would probably crash during the next day's usage. This was because the fan wasn't capable of keeping the components cool enough to ensure proper functioning. One after the other, parts would fail, resulting in a dead PC.

Hard drives

Hard drives not only store your data but also host the operating system's files, which the OS depends on to run.

The problem with regular hard drives, or Hard Disk Drives (HDDs), is that they rely on a spinning platter to store and read information. Leaving the computer on at all times could damage the hard drive in more than one way. While a computer is running, power surges from the supply can disrupt the drive's spinning mechanism, and repeated disruptions make it more likely to fail.

Software

When a baby is born, it doesn't have all its senses fully developed right off the bat. But as it grows into a toddler, its senses sharpen and the little one acquires enough intelligence to make sense of the things around it. That makes it more curious, and it starts questioning the existence of the objects in its surroundings.

This is much like how software evolved over the years. Initially, it was only capable of adding, subtracting, multiplying, and dividing numbers. But with increasing demand from fields such as space exploration, education, and business, the need for mechanisms that could automate processes grew exponentially. That is how software evolved into what we see today.

However, jumping back a few years, software was capable of flying airplanes but not of effectively managing a personal computer. Boot-ups and shut-downs were not handled efficiently, and there was minimal coordination between the software and the hardware. This inefficiency paved the way for crashes when a computer was left turned on for a whole week; it indirectly stressed the mechanical parts and could bring the whole system down.

Memory leaks

While you may not have come across this term, it is one of the most crucial concerns when it comes to writing software. Memory leaks have been responsible for major computer crashes in the past, and in practical terms, they relate directly to failures caused by long uptimes.

Basically, a memory leak is a type of resource leak that occurs when a computer program mismanages memory allocations in such a way that memory which is no longer needed is never released. And when computers were kept on for weeks on end, back when software wasn't written to guard against memory leaks, they eventually ran out of memory and crashed.
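
To make that concrete, here's a minimal, hypothetical C sketch of a leak (the buffer size and loop are made up purely for illustration): the program keeps allocating memory and never frees it, so the longer the machine stays up, the more memory the process hoards, which is exactly why long uptimes used to end in crashes.

```c
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

/* A deliberately leaky loop: each iteration allocates a buffer,
 * uses it briefly, and then "forgets" to free it. Left running
 * long enough, the process exhausts available memory. */
int main(void) {
    while (1) {
        char *buffer = malloc(1024 * 1024);   /* grab 1 MB */
        if (buffer == NULL) {
            return 1;                         /* allocation finally failed */
        }
        memset(buffer, 0, 1024 * 1024);       /* pretend to do some work */
        /* BUG: free(buffer) is missing, so this memory is never released */
        sleep(1);
    }
    return 0;
}
```

Every second that program runs, it quietly claims another megabyte it will never give back; a week of uptime is more than enough for that to hurt.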

In fact, I’ve also heard of an instance (which I don’t know if it’s true or not) where an airplane suffered an engine failure because of a lethal memory leak in the plane’s autopilot function. Thankfully, the plane never took off, and no lives were lost.

These reasons made it genuinely difficult for people to keep their computers on for weeks, and they also lit the flames beneath the modern-day myth that computers can't stand going days without being turned off. It boils down to this: today, the risk of your device dying from being left on for days on end depends almost entirely on how old it is!

So, should you shut your computer down at night?

The answer largely depends on how you use your computer and what you need it to do. Trends have certainly changed over the past decade. People are more dependent on technology than ever and can scarcely live without it.

Regular office work, limited time on YouTube, and an hour on MySpace: that was the typical scenario back in the 2000s. But now that portability and seamlessness have invaded our lives for good, no one has to go back to a desk to finish a task. Yet portability comes at a cost, one that can never be fully covered by a device as small as a smartphone. Try editing a video on your iPhone, for instance, and you'll appreciate the importance of a full-fledged personal computer.

Anyway, the point is that it's our expectations that have grown, even more than the computers themselves. We demand more from technology that can only provide so much, and, in a positive way, that demand fans the flames of further development and breakthrough technologies. No matter what, you'll always feel the need for a laptop or a desktop to finish a task properly.

Considering the current pace of life, it's important not to waste time just booting up the computer when you need it to do the most spectacular stuff you can imagine. This isn't a pep talk, but it has to be said: aside from the hardware-related reasons for not turning off your PC, keeping your computer on at all times lets you create things faster, helps you hold on to inspiration, and probably saves you from exasperatingly time-consuming Windows updates.


Just to drive the point home one last time, imagine you've just woken up and brewed a perfect cup of coffee. You decide to watch the morning briefings on your laptop. You have the cup in one hand and you're struggling to boot up the computer with the other. By the time it boots, your piping hot coffee has turned into a cold sludge. You curse the coffee. You curse the laptop. And you keep cursing everything throughout the day. Now imagine things going the other way. You tap a key on the keyboard, the display lights up like Christmas lights in a split second, you watch something great, and your day turns out far more productive.

Well, modern computers are built for exactly that purpose: you should be able to use them as much as you want.

Constantly turning a computer on and off arguably does more harm than leaving it running. Many components in a PC behave like light bulbs: the stress of repeated power cycling wears them out faster than steady operation. Also, yanking a computer's power cable straight out of the socket can do real damage to the internals because of sudden power interruptions and surges.

On top of everything, modern computers really are built to handle intense workloads. Operating systems these days are adept at maintaining a proper balance, a proper cohesion, between the hardware and the software. Everything just falls into place by itself, and you don't have to worry for a second.

There is competent memory management and garbage collection to prevent memory leaks, and the mechanical aspects of the computer are controlled by software now more than ever. There's a simple physical principle relating work done to heat generated: the more work you do, the more heat you produce. Similarly, the more a computer works during its uptime, the more heat it generates. And heat is pretty much a computer's worst enemy.

However, modern hardware and software together have a solution for that. CPUs, for instance, throttle their clock speed for a while to give themselves room to cool down, so that heat doesn't hurt the computer in the long run. Other components protect themselves similarly. But how do they know it's time to do so? Software.
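
As a rough illustration of how software-visible this is, here's a minimal, Linux-only C sketch. The sysfs paths used below are an assumption; they are common on Linux but vary between machines and kernels. It simply reads the CPU temperature and current clock speed; run it while the machine is under heavy load and you'll typically see the reported frequency drop as the chip throttles itself to cool down.

```c
#include <stdio.h>

/* Read a single integer value from a sysfs file; returns -1 on failure. */
static long read_sysfs_value(const char *path) {
    FILE *f = fopen(path, "r");
    if (f == NULL) return -1;
    long value = -1;
    if (fscanf(f, "%ld", &value) != 1) value = -1;
    fclose(f);
    return value;
}

int main(void) {
    /* Assumed paths: common on Linux, but machine- and kernel-dependent. */
    long temp_milli = read_sysfs_value("/sys/class/thermal/thermal_zone0/temp");
    long freq_khz   = read_sysfs_value("/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq");

    if (temp_milli >= 0)
        printf("CPU temperature: %.1f C\n", temp_milli / 1000.0);  /* reported in millidegrees */
    if (freq_khz >= 0)
        printf("CPU frequency:   %.0f MHz\n", freq_khz / 1000.0);  /* reported in kHz */

    /* Under sustained load the temperature climbs, and once a thermal
     * limit is reached the frequency drops as the CPU throttles itself. */
    return 0;
}
```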

The fan plays a major role in cooling a system. While there are certainly more advanced cooling systems, fans are the most affordable and the most widely available. Nevertheless, fans wear out with continuous rotation, partly because of the dust they attract while spinning. Dust buildup can bog down a computer, so cleaning the fan regularly helps it last longer. It's not just fans, either; dust is a problem for the whole machine, which means cleaning should be routine.

Now, laptops are somewhat different from desktops. The major difference a laptop brings is, of course, the battery. It changes how things work and adds one more thing to worry about if you've been blindly following the myth to date.

It's difficult to explain how a battery works in just a paragraph; it really warrants a separate post. But it's important to know that keeping your laptop turned on for days won't, by itself, hurt the battery. Of course, to keep a laptop alive for, say, a week, you'd need to keep it plugged in. And this is where another myth collides with the one we're examining.

According to that myth, keeping a laptop plugged into a power source all the time will damage its battery. But, like every other component, laptop batteries have gotten smarter. There's little harm in keeping your laptop plugged in, because the charging circuitry stops feeding power into the battery once it knows the charge is complete. So the idea of batteries getting destroyed by long uptimes is also largely a myth.

Bottom line: your computer will age whether you use it heavily or not at all. Capacitors, for example, are found all over a computer's circuitry, and they can wear out just by sitting there, since many contain a liquid electrolyte that can leak or evaporate over time.

Conclusion

Myths have deep roots in misinformation. People blindly believe what they hear from someone else. That's why Sid lost a good share of his grade to a stupid virus he could have dealt with right away, if only he hadn't been busy cursing himself for keeping his laptop turned on too long.

All it takes is a little different thinking and a different approach to things. That, after all, is how people like you and me innovate, develop, and succeed.


Previously: MythBusters Chapter 1- Can a smartphone damage a credit card?