Moore's Law
Today, when we hear someone use Moore's Law as a predictor of something, it's almost never about the doubling of the number of transistors that can fit on a chip. Instead, it's usually in reference to the doubling of speed, processing power, or some other relevant metric of computing strength.
In the case of memory (e.g., RAM), an increase in the number of transistors means an increase in the number of memory circuits that can be built from those transistors. And more memory means more data can be stored and/or manipulated.
![Picture](/uploads/6/9/9/0/69906685/screen-shot-2017-01-20-at-8-21-00-am.png?250)
The mathematical property of Moore's Law that makes it such a powerful concept is that it describes exponential growth in the capacity and performance of electronic components. Rather than a steady pattern of incremental improvements over time (a linear pattern), we instead see an accelerating increase in performance (an exponential pattern). That is, each year doesn't just deliver the same improvement as the previous year. Instead, each year brings a larger improvement (twice as large, in fact) than the year before, and the following year brings a larger increase still.
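The difference between the two patterns can be sketched in a few lines of code. This is a minimal illustration, assuming a hypothetical starting count of 1,000 transistors, a fixed increment for the linear case, and one doubling per period for the exponential case:

```python
def linear_growth(start, increment, periods):
    """Steady improvement: add the same fixed amount each period."""
    return start + increment * periods

def exponential_growth(start, periods):
    """Moore's Law-style improvement: double the count each period."""
    return start * 2 ** periods

start = 1_000  # hypothetical starting transistor count
for period in range(6):
    print(period,
          linear_growth(start, 1_000, period),
          exponential_growth(start, period))
```

After five doublings, the exponential count reaches 32,000, while the linear pattern has only reached 6,000, and the gap widens with every additional period.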
Decidability and Efficiency
As we saw with searching and sorting algorithms, there are often many different approaches to solving the same problem. Multiple solutions may be equally valid, with no single one being exclusively better than the others. In fact, for most of the problems that you'll encounter in your life, there is almost never just one way that the problem can be solved. Instead, there are almost always multiple ways of solving a given problem, some of which might be better choices than others.
Your challenge as a computational thinker is to 1) design a solution that works and 2) recognize the ways in which your algorithm can be made more efficient.
In the case of searching algorithms, Binary Search proved to be much more efficient than Sequential Search for those lists that were already in sorted order. For sorting algorithms, Insertion Sort was more efficient than Selection Sort or Bubble Sort. But there are other, more advanced algorithms, like Merge Sort and QuickSort, that can achieve even better performance.
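The efficiency gap between the two search strategies is easy to demonstrate by counting comparisons. This is a minimal sketch (the function names are illustrative, not from any particular library); note that Binary Search requires the list to already be in sorted order:

```python
def sequential_search(items, target):
    """Check each element in turn; returns (index, comparisons) or (-1, comparisons)."""
    comparisons = 0
    for i, item in enumerate(items):
        comparisons += 1
        if item == target:
            return i, comparisons
    return -1, comparisons

def binary_search(items, target):
    """Repeatedly halve the search range of a SORTED list."""
    comparisons = 0
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        comparisons += 1
        if items[mid] == target:
            return mid, comparisons
        elif items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1, comparisons

data = list(range(1, 1001))           # a sorted list of 1,000 numbers
print(sequential_search(data, 1000))  # → (999, 1000): checked every element
print(binary_search(data, 1000))      # → (999, 10): halved the range each step
```

Searching for the last element, Sequential Search makes 1,000 comparisons while Binary Search needs only 10, which is exactly the accelerating advantage that makes choosing the right algorithm matter as inputs grow.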