Wednesday, 6 July 2016

Let's Begin

Hello Again,
 
We have all used a computer, and we all know how to create files, open them, copy them, and do all the other everyday things. Basically, we all know how to operate a computer; we can do almost anything on one. So how does the computer understand what we are trying to do? It doesn't have a mind. So how does it do all those things?

Basically, a computer is nothing but a huge collection of tiny electrical switches (transistors) wired together. It operates differently as the input current changes, and that is where the 0s and 1s come from: when current is passed into a hardware part, it is taken as 1, and when there is no current, it is taken as 0.
So, based on current on and current off, the device behaves differently. It can therefore be said that a computer only understands the language of 0s and 1s.
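To make the idea concrete, here is a minimal Python sketch (the on/off states below are a made-up illustration, not a real hardware reading) showing how a sequence of current-on/current-off states can be read as a binary number:

    # Hypothetical hardware states: True = current on (1), False = current off (0)
    states = [True, False, True, True]

    # String the on/off states together as the bits of a binary number
    bits = "".join("1" if on else "0" for on in states)  # "1011"
    value = int(bits, 2)                                 # binary 1011 -> decimal 11
    print(bits, "=", value)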

Now that raises another question: how does a computer understand everything we say using just 0s and 1s?
Actually, the number of 0s and 1s that can be strung together is not limited.
For example, "1000100010111" is different from "100010", which is also different from "100011".
As there is no limit on the number of times 0s and 1s can be used, a computer can represent a lot of different things.
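As a quick check, Python's built-in int() can interpret each of those strings as a base-2 number, and each one indeed comes out as a different value:

    print(int("1000100010111", 2))  # 4375
    print(int("100010", 2))         # 34
    print(int("100011", 2))         # 35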

But how do we know how many 0s and how many 1s to write?
There is always a possibility of conflicts and duplication. Let's say I want my computer to take 101 as A, 1011 as B, and so on, but my friend's computer takes 1001 as A, 11001 as B, and so on. Here comes a problem: since computers were to be used worldwide, it was not possible for everyone to have their own key assignments; the assignments had to be the same everywhere. Many companies tried to define standard values, and at last ASCII (American Standard Code for Information Interchange) came into existence.
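For example, ASCII fixes 'A' at 65 (binary 1000001) and 'B' at 66 (binary 1000010), so any two machines that follow the standard agree on the same bit patterns. A short Python sketch using the built-ins ord() and bin():

    for ch in "AB":
        code = ord(ch)               # ASCII value of the character
        print(ch, code, bin(code))   # A 65 0b1000001, then B 66 0b1000010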
