Computer science students typically spend at least one or two semesters studying computer architecture (hardware design) and machine/assembly language. Most will never become hardware designers or assembly language programmers, but understanding how data are represented in computer hardware makes all of us better programmers.
Knowing the limitations of computer number systems is necessary both to write programs that produce correct and precise output and to write the most efficient programs possible.
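As a small illustration of the kind of limitation this chapter is concerned with, the following Python sketch (the chapter itself is language-neutral; Python is used here only for convenience) shows two common surprises: binary floating-point arithmetic cannot represent a decimal fraction like 0.1 exactly, and fixed-width integers wrap around on overflow.

```python
# Limitation 1: 0.1 has no exact binary floating-point representation,
# so small rounding errors accumulate in arithmetic.
total = 0.1 + 0.2
print(total)            # 0.30000000000000004
print(total == 0.3)     # False

# Limitation 2: fixed-width integers wrap on overflow. Python's own ints
# are arbitrary-precision, so we simulate an 8-bit signed integer by
# masking to 8 bits and reinterpreting the sign bit.
def to_int8(n):
    n &= 0xFF                       # keep only the low 8 bits
    return n - 256 if n >= 128 else n

print(to_int8(127 + 1))  # -128: the largest 8-bit value wraps to the smallest
```

Results like these are not bugs in the language; they follow directly from how the hardware represents numbers, which is exactly why this material matters to every programmer.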
This chapter provides a brief overview of these limitations. We will cover the standard number systems used by modern computers, alongside some historical and hypothetical systems that help put them in context.
What are two advantages of understanding computer data representation if we are not doing hardware design?