Common Misconceptions About Technology

There are many misconceptions about technology that can trick people into believing outright myths. Some examples are "Macs don't get viruses" and "More megapixels means better pictures", neither of which is true. This blog post will disprove several of these misconceptions about different aspects of technology.

Mac Computers Can’t Get Viruses

Many people, mostly Apple fanboys, believe that MacBooks and iMacs are invulnerable to viruses, malware and other electronic hazards. The most likely reason is that Windows holds roughly 80% of the computer market, and since Windows machines are far more common, hackers focus their attention on Windows users.[1] This misconception became so widespread that Apple even ran an ad about it (see above) for publicity and used it as a selling point. It was proven wrong when a Trojan named "Flashback" infected more than 600,000 Macs in 2012.[2] The outbreak forced Apple to change the advertising on its website and changed the perspective of thousands of Apple users.

Huspeni, Andrea. June 26, 2012. Online Image. April 23, 2015.

More Megapixels Always Means A Better Camera

Many people are convinced that the more megapixels a camera has, the better it is, especially on new phones such as the iPhone and Samsung's flagships, which keep trying to outdo each other's cameras. The truth is that there isn't a huge difference between an 8-megapixel camera and a 12-megapixel camera. The quality of a picture is determined largely by how much light the camera's sensor can take in. Normally, a bigger sensor comes with bigger pixels, and the bigger the pixels, the more light each one absorbs.[3] So picture quality depends more on the size of the pixels than on their number. Professional photographer Matthew Panzarino discusses the iPhone 5S camera, as well as what a megapixel is, here.
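To make that trade-off concrete, here is a rough back-of-the-envelope sketch in Python. The sensor dimensions below are hypothetical (roughly a small phone-camera sensor, not any specific device); the point is that cramming more megapixels onto the same sensor shrinks each pixel, so each one catches less light:

```python
# Hypothetical sensor, about 4.8 mm x 3.6 mm (assumed, not a real spec sheet)
SENSOR_W_MM, SENSOR_H_MM = 4.8, 3.6

def pixel_pitch_um(megapixels):
    """Approximate width of one (square) pixel, in micrometres."""
    pixels_per_mm2 = (megapixels * 1e6) / (SENSOR_W_MM * SENSOR_H_MM)
    pitch_mm = (1 / pixels_per_mm2) ** 0.5  # side length of one pixel
    return pitch_mm * 1000  # mm -> micrometres

for mp in (8, 12):
    print(f"{mp} MP -> ~{pixel_pitch_um(mp):.2f} um per pixel")
```

On the same sensor, the 12 MP pixels come out smaller than the 8 MP ones, so each pixel gathers less light, which is why megapixel count alone says little about image quality.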

The World Wide Web And The Internet Are The Same Thing

No Author. August 6, 2011. Online Image. April 23, 2015.

The Internet and the World Wide Web are not the same. The Internet is "the infrastructure that connects networks across the world, including both the hardware (computers, servers, cables and more) and the software,"[4] while the web is just one of the services that runs on top of that infrastructure. We access the web through browsers such as Chrome, Firefox, Safari, and countless others. Most people use it to watch videos on YouTube, play Flash games, look at ridiculous memes, and even read blog posts like the one you're reading right now.
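As a small illustration, here is a sketch using Python's standard socket module, which looks up well-known port numbers from the operating system's services database. The web's protocol, HTTP, sits alongside plenty of non-web services that all ride on the same Internet:

```python
import socket

# http/https power the web, but email (smtp), name lookup (domain, i.e. DNS)
# and remote shells (ssh) are separate Internet services, not "the web".
for service in ("http", "https", "smtp", "domain", "ssh"):
    # getservbyname reads the OS's list of well-known service names
    print(f"{service:>6} -> port {socket.getservbyname(service)}")
```

The exact list available depends on the operating system's services file, but on a typical system HTTP maps to port 80 while DNS ("domain") maps to port 53.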

64-Bit OS Runs Faster Than a 32-Bit OS

There is no inherent difference in speed between a 64-bit OS (operating system) and a 32-bit OS; with the same computer components, they run at the same speed. The practical difference is that a 64-bit OS can use more than 4GB of RAM (random-access memory), because a 32-bit OS can only address 2^32 bytes, which works out to 4GB.[1] If you had 8GB of RAM on a 32-bit OS, the computer would only be able to use 4GB of it. This can matter when playing video games on a PC, since many games thrive on RAM: on a 64-bit OS they might reach a higher FPS (frames per second), depending on the game and how much RAM it uses. For everyday tasks such as browsing the web or listening to music, there is no difference at all.
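The 4GB ceiling falls straight out of the address width. A quick sketch of the arithmetic:

```python
# A memory address is just a number; with n address bits an OS can
# name at most 2**n distinct bytes of memory.
def max_addressable_gib(address_bits):
    return (2 ** address_bits) / 2 ** 30  # bytes -> GiB

print(max_addressable_gib(32))  # → 4.0 (the 32-bit ceiling)
print(max_addressable_gib(64))  # astronomically larger
```

So the extra bits buy you address space, not clock speed, which is exactly why the two systems feel identical for light workloads.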


What are some misconceptions about technology that you know and how are they wrong?