Teaching computers a new way to count could make numbers more accurate
Changing the way numbers are stored in computers could improve the accuracy of calculations without increasing energy consumption or computing power, which could prove useful for software that needs to switch quickly between very large and very small numbers.
Numbers can be surprisingly difficult for computers to work with. The simplest are integers: whole numbers with no decimal point or fractional part. As integers grow larger, they require more storage space, which can lead to problems when we attempt to reduce those requirements – the infamous millennium…
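The storage problem described above can be sketched in a few lines. This is an illustrative example, not from the article: it simulates a fixed-width unsigned integer by wrapping values modulo 2^bits, the same failure mode behind storing years as two decimal digits in the millennium bug.

```python
# Illustrative sketch (not from the article): a fixed-width integer
# wraps around once a value exceeds the storage allotted to it.

def wrap(value: int, bits: int = 16) -> int:
    """Simulate storing `value` in an unsigned integer of `bits` bits."""
    return value % (2 ** bits)

# A 16-bit counter silently wraps to 0 one step past its maximum.
print(wrap(65_535))      # 65535 — the largest 16-bit unsigned value
print(wrap(65_535 + 1))  # 0 — overflow: the true value no longer fits

# The millennium bug was the same idea with decimal digits:
# keeping only two digits of the year wraps 1999 + 1 back to "00".
print((99 + 1) % 100)    # 0
```

The wrap-around happens silently: the hardware keeps only the low bits, so nothing signals that information was lost.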