The Futurist

Web Name: The Futurist

WebSite: http://www.singularity2050.com

We are very near to being able to declare absolute victory on the ATOM thesis. Remember that March 15, 2020 really was the Netscape Moment in Economics. The US Fed Funds rate, which was the only major rate in the world that was foolishly high at that point, went from 1.5% down to 0% (permanently), and trillions in new monetary creation commenced. As of August, the four major central banks are at +35.3% on a year-over-year basis (source: Yardeni). Meanwhile, the US 10-yr Treasury Note languishes at a 0.7% yield, the weighted average yield of all high-grade 10-yr Sovereign Bonds worldwide is approximately 0.00%, and oil remains below $40/barrel even now, while the tech-laden Nasdaq 100 continues to make new all-time highs. What more proof is required that monetary creation a) does not cause inflation up to a fairly high annual rate of creation, and b) finds its way into technology, to produce more technology?

Now we get the benefit of probing where the ceiling of the monetary creation gradient might be. I have maintained in the ATOM publication that 16-24% was the optimal rate of increase (based on my own proprietary research about the depth of technological density and acceleration), with a lower number resulting in insufficient inflation and a higher number causing brief inflation. Now we happen to see a 35.3% net YoY increase. This is well above the band I specified, but it also follows a period of slack, which means the CAGR over the last several years is still well below the 24%/yr upper bound. If the current YoY increase is in fact an overshoot above the optimal zone, there will be a very brief blip in the CPI. This will cause the disgruntled inflation hawks and PhD Economists to emerge from the woodwork to point out how the entire ATOM thesis is wrong. They will be suitably embarrassed yet again, since the blip will be very brief once the trendline of 16-24% catches up.

As we can see from the second chart, the CPI is just not having it. Nor is the Goldman Sachs Commodity Index, which represents worldwide prices of all commodities (oil, gold, natural gas, silver, coffee, etc.). It is down a whopping 60% from its 2010 levels, despite all the QE. Even this index does not represent the accurate scale of commodity deflation, since I contend that computational power, storage, and bandwidth should all be commodities in this index (as volatility already is, despite not having a physical form). Inclusion of these components would reveal a faster as well as more accurate deflationary picture. This trend can only continue and accelerate through the 2020s and beyond.

Also note how large the base of cumulative monetary action now is. As we see from the chart, the YoY dollar amount is $7 Trillion, and this is just for the four largest central banks (which account for 85% of all monetary creation). Just to stay at 16% YoY growth for the next 365 days, another $4.3 Trillion has to be done.
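To make that arithmetic explicit, here is a minimal back-of-the-envelope sketch. The $7 Trillion YoY increment and the 35.3% YoY rate are the Yardeni figures cited above; the implied balance-sheet base and the $4.3 Trillion result are simple arithmetic derived from them, not additional published data.

```python
# Back-of-the-envelope check of the QE arithmetic above (illustrative only).
yoy_increase = 7.0e12      # $7 Trillion added over the last year (four major central banks)
yoy_rate = 0.353           # +35.3% year-over-year

prior_base = yoy_increase / yoy_rate          # implied balance-sheet total a year ago (~$19.8T)
current_base = prior_base + yoy_increase      # implied total today (~$26.8T)

# New creation needed over the next 365 days just to hold the lower bound of the 16-24% band:
needed_at_16pct = 0.16 * current_base
print(f"Implied current base: ${current_base/1e12:.1f}T")
print(f"QE needed for +16% YoY: ${needed_at_16pct/1e12:.1f}T")   # ~ $4.3T, matching the text
```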
As I said in June: I said elsewhere that the decade of the 2010s had $23 Trillion of cumulative QE worldwide. The PhD Economists of the world, who have predicted 100 of the last zero bouts of hyperinflation, still believe QE is an aberration and assume that the cumulative QE will be reversed (i.e. that the 2020s will have -$23 Trillion of cumulative QE). I claim the opposite, which is that under both ATOM principles and the Accelerating Rate of Change, the 2020s will see about $100 Trillion of QE, and that this will move towards sending cash directly to people (rather than the esoteric bond-buying that constitutes QE today, which inevitably concentrates the benefit of this monetary creation in very few hands).

Does anyone doubt that the 2020s will in fact see $100 Trillion of QE? The first eight months of 2020 are certainly on track for that trend. That means it is also on track for a greater diffusion of future monetary creation. The current channels are super-inefficient, super-saturated, and frankly, one could scarcely devise a better way for all new monetary creation to go just to the wealthiest tech billionaires while average people get nothing. Furthermore, while bad governance can destroy anything (and this sort of new safety net actually increases the level of bad governance, as the penalties are delayed), the fact that the central banks of the world reacted so quickly means that a number of negative economic phenomena might very well be in the past. For example:

i) There may never be a traditional recession again, based on the technical definition of a recession, which is two consecutive quarters of negative Real GDP.
ii) There may never again be a stock market correction so severe that the S&P 500 remains over 10% below its all-time high for a full calendar year.
iii) The S&P 500 may never again go more than three years without making an all-time high. Remember that dividends (about 1.7%/yr) also exist.

Points ii) and iii) above prove that the equity index, rather than gold, is the true safe haven. The gradient of progress in the modern era is just too steep for the multi-year recessions of the past to happen anymore, barring the worst governance. The divergence between the performance of gold vs. that of the Nasdaq 100 over the last decade is extreme. The proof is piling up. The Economics PhD ivory tower cannot continue its denial forever, as it is already in the dustbin of history. Yes, most recent articles here have been very similar, but remember that we are in the midst of a seminal historical turning point that almost no others have caught on to yet.

Update: For those worried about Money Supply, note that M1 has increased 42% YoY, and M2 about 24%. This is at a level where even I thought there could be inflation, since M1 is the most liquid and rapidly-circulated pool of money. Such inflation could happen, but has not happened yet. If big increases in even M1 have not caused inflation (still TBD), then the case for ATOM-DUES is even stronger, as one of the last few unknowns has been exposed as a non-event.

Related ATOM Chapters:
2: The Exponential Trendline of Economic Growth
4: The Overlooked Economics of Technology

For this month's award, we will delve into a disruption that took place a few years ago, and how that disruption is about to have a second act, accelerated by coronavirus. Hence, it is related to the June 2020 ATOM AotM. Streaming video has already disrupted the film and television industries completely. Due to recent events, it is now on the brink of its second act. First, a recap of the first act. Netflix was the first and most well-known among existing streaming services, of which there are now over a dozen. The disruptions inherent to this were immense and spanned multiple sectors.
Blockbuster alone had 9,094 stores at its peak in 2004, alongside Movie Gallery's 4,700, and these were not small stores in square footage and parking lot space. There were other, smaller video rental chains as well. Collectively, they consumed thousands upon thousands of acres of prime urban land. The removal of this entire resource allocation and supply chain was immensely productive, and rapid. Additionally, the entire structure of how films and television are produced was upgraded into a more productive version. The old paradigm of 26 episodes per season, and commercial breaks every few minutes, was extremely restrictive. Most one-hour programs from the 1980s and 90s were barely 40 minutes of real content. All of that has been swept aside with the technological revelation of on-demand streaming. The exceptionally low price of most streaming services qualifies as a true ATOM disruption. Try explaining to young people today how, in the early 1980s, people had to rush home to avoid missing their favorite program or a movie being shown on TV for the first time (with commercials taking up a third of that time).

But the second act is the disruption of the next industry, education, by the same medium. The seeds were sown in 2015, with only the 800,000 Chinese students paying full price delaying the manifestation of this ATOM disruption. Coronavirus has shuttered in-person schooling, which has forced students online. Yet universities, in their hubris, want to charge the same fee. This invited comparisons that universities are ill-equipped to rebut, and their responses are uncannily similar to what taxi medallion owners said when Uber disrupted their entitled status quo. Suddenly, questions of both cost and duplication of effort began to move to the forefront. The following list summarizes the extent of the mismatch, even though the Harvard annual price actually represents just the eight-month academic year. Furthermore, $50,000/yr for Harvard is probably still a better value than $40,000/yr for a university ranked closer to #50. Of course, this table is for dramatic effect, and the disruption is going to hit lower-value education (K-12 and lower-tier universities) first. Articles written about this disruption have appeared in important non-tech publications:

The Atlantic
Harvard Business Review
New York Magazine

The emperor-has-no-clothes moment for the ridiculously expensive education industry draws near. People are finally beginning to figure out how little value (or even negative value) they are getting for their money, combined with the acceleration of technological alternatives. This is something I have been anticipating for some time, and it had thus become one of the most overdue disruptions around. The key, of course, is for employers to have the courage to buck the status quo. Employers have been unusually timid about hiring entry-level candidates without degrees, or even creating their own onsite training programs. For example, there is no reason why a large tech company cannot simply hire the 18-year-olds with the highest SAT scores, put clusters of them into on-site training programs, and even house them in local dormitory-like apartments near the office campus. If the SAT score is not enough information, students can also add a dossier of their accomplishments and a writing sample. Effectively, the college application, which a student currently pays $50-$100 for the privilege of submitting, can become a free online upload, accessible only through logins from verified employer HR staff email addresses.
Once hired, the tech company mentors the candidate and commits to three years of employment, so that their resume is sufficiently solid in the absence of the university credential. Each US military branch has basic training of 7-12 weeks in duration, so there is no reason tech companies cannot have a 3-year training program for 18-year-olds. As long as other tech companies recognize the training from the first (Google, Amazon, etc.) as valid, the graduates can circulate throughout the tech industry, and the monopoly of universities is broken. In addition, the premise that the contacts a student makes at a university are more valuable than the contacts they make in their first three years at Google, Amazon, or Tesla is absurd. The truth about college is that a person's best friends usually are not going into the same professional field, whereas in the training program that link is built in. Even if their compensation net of free housing is very little, that is still vastly higher net compensation than paying for a university degree. But alas, these supposedly innovative tech companies have not yet demonstrated the courage to bypass college. Fortunately, the ATOM might do the hard work of normalizing streaming education at the same cost as other streaming content, bypassing the need for employers to wait for a Spartacus. This long-overdue correction of a massive resource misallocation may finally be upon us.

Related:
The Education Disruption: 2015

Related ATOM Chapters:
3. Technological Disruption is Pervasive and Deepening
8. The ATOM Transformation by Sector
9. Reframing Inequality

The pandemic has ratified and accelerated a whole host of ATOM principles, so I have to update parts of the entire publication. Suffice it to say, a number of pent-up ATOM predictions just got fast-forwarded, with a seminal day in the history of economics having been forced into manifestation. We can divide the events into two parts: technological and monetary. Among technological disruptions, there are three that qualify as having been overdue for a long time, and that got tipped over by this catalyst:

1) Video Conferencing: This was something that Cisco expected to take off 14 long years ago, but expensive proprietary hardware and the inertia of old habits prevented it from attaining the critical mass necessary for entrenchment. Cisco lost at least $6 Billion on this endeavor. Now, however, as people are forced to work from home, a critical mass of users has had to adapt to this usage, which in turn attracts more innovation and capital to the technology. While none of the companies advancing videoconferencing in 2009 are the ones winning now, this is common in the technology sector (recall the search engine wars). The cascade of disruptions I listed in 2009 still applies. Among other things, if cubicle-style workplaces can agree that all on-premise meetings are restricted to three days a week (M-W, or Tu-Th, or whatever), then the distance an employee can commute effectively doubles, and since the reachable area grows with the square of that distance, the housing available to them increases 4X. The current status quo of certain real estate being vastly more expensive than equivalent real estate 30 miles further from the jobs cluster may finally correct. This is a form of standard-of-living increase that is poorly captured in GDP statistics.
2) Educational Institutions: The extraordinarily distorted cost/value equation of both higher and lower educational institutions (which should not be conflated with the concept of education itself) already crossed the point of no return in 2015. But, as with videoconferencing, too few people were willing to be Spartacus and make use of alternative solutions that were in fact lower risk. This applies to both students and employers, for employers declaring that they will hire based on on-site testing and online certifications, rather than degrees that bear little predictive value for employee performance, is the catalyst that would have induced more students to bypass the universities-as-gatekeeper oligopoly. The fact that universities want to charge the same tuition for online classes (and are being sued by students disputing this), when comparable online classes are available at prices that are orders of magnitude lower, is going to reduce US university enrollment permanently. To cope, there is no reason that US universities cannot be forced to return to a 1980s-era cost structure.

3) Retail Real Estate Re-Purposing: Overlooked among the technological and economic effects of this black swan is the fact that the retail apocalypse and shift to e-commerce have fast-forwarded to such an extent that millions of acres of US retail land (including parking lots) will never return to previous levels of usage. This was partially mentioned in the ATOM AotM for August 2017, and is often brought up in comments. E-commerce was still just 12% of all retail sales before the pandemic, but if that 12% were to shift to 15%, effectively jumping two years ahead of the previous trend, that alone is a vast acceleration with visible results for the suburban landscape. In fact, when you combine the permanently lower demand for premium office space from the greater usage of videoconferencing with the mass closure of retail real estate (at least in the US, where six times as much land is allocated to retail relative to most advanced countries), the correction and pressure to re-allocate could be extreme. In places like California, the extreme restrictions on new residential construction will be exposed even more visibly as office space joins retail in a permanent glut.

But the bigger event was not even these technological accelerants. Instead, it was the complete and supreme validation of all ATOM conclusions. Recall that the Federal Reserve was actually reversing QE and increasing interest rates in 2019. It had begun to pause and correct that misguided reversal, but still at too timid a rate of net increase to even keep up with the ATOM trendline of monetary creation required to offset technological deflation. However, this crisis forced the Federal Reserve to do the right thing, even if they still don't understand the new economics of technology. March 15, 2020, is a day that can fairly be described as the Netscape Moment in Economics. For those who recall the original Netscape Moment: on August 9, 1995, the Internet browser company Netscape did an IPO that exceeded its anticipated price by a huge margin, and triggered a boom in Internet company formation for the next 4.5 years. Even after the bust, the economy was permanently in the Internet age. Similarly, 3/15/2020 is the day when the Federal Reserve, in one fell swoop, lowered the Fed Funds rate to 0% (where it should have been all along) and signaled permanent QE.
In the following 10 weeks, over $3 Trillion of new QE was done, and the entire trajectory is starting to look more like the exponential parabolas that we are accustomed to seeing wherever the accelerating rate of change and exponential technology emerge. As of May 31, here are two charts to depict the total QE effect (source: Yardeni). The first chart indicates the cumulative rise in the sum of the four major central banks. Note the feeble attempt to reduce the balance sheets in 2018-19, only to be forced back to the trendline. The second chart is the YoY percentage increase. I have always said that the ATOM requires 16-24%/yr as an annual rate of increase to offset deflation and maintain optimal (2-3%) inflation. The increase is now probing the upper limit of even my range, and it will be interesting to see if inflation emerges even then, or if the ceiling is even higher than I estimated (meaning that technological progress is now even faster and broader than before, and monetary creation could be higher than before).

I said elsewhere that the decade of the 2010s had $23 Trillion of cumulative QE worldwide. The PhD Economists of the world, who have predicted 100 of the last zero bouts of hyperinflation, still believe QE is an aberration and assume that the cumulative QE will be reversed (i.e. that the 2020s will have -$23 Trillion of cumulative QE). I claim the opposite, which is that under both ATOM principles and the Accelerating Rate of Change, the 2020s will see about $100 Trillion of QE, and that this will move towards sending cash directly to people (rather than the esoteric bond-buying that constitutes QE today, which inevitably concentrates the benefit of this monetary creation in very few hands). Mark my words. The entire profession of economics, full of PhDs who have never had any contact with entrepreneurs and real-time economic decisions, will be wrong by an epic margin. The time has arrived.

The second version of the ATOM Publication is now out. This includes about 10% more text, updated data, charts, and incorporation of the Sovereign Venture Fund idea. The first version got about 400,000 visits, and was featured in a Google Talk as well as a variety of television programs. Long-timers who haven't read it in years should give it another examination, while newcomers should dive in deep. We are going to hit a higher level of outreach and PR in the near future, and I may need some volunteers.

It is time to award a new ATOM AotM, the first one roughly coincident with the newly published version 2.0 of the ATOM publication. This one has been discussed elsewhere on this website, but at the moment, it is perhaps the single biggest disruption in the global economy. First, a small story. Photovoltaics (PV) is actually my first exponential technology, and what made me aware of the concept at a time when even Moore's Law was not a household term and Intel was a very small company. When I was 10 years old, I wrote a 5th-grade paper (about 800 words) on the exponential improvement of solar cells at about 5% a year. It predicted cost-effectiveness in the early 21st century. The paper was a hit and was submitted to a contest that all the elementary schools submitted papers to, and I was taken, along with a number of other 5th graders from the Cleveland area, to an event where the Mayor of Cleveland at the time (George Voinovich) met the children and did a photo op. We didn't understand any of that, but each child got a framed certificate.
Sure enough, as the decades passed, PV did become cost-effective in the early 21st century. It has been a subject on this website for a long time, and while we often point out how many technologies have failed to meet industry-derived projections, PV has done the opposite: as you can see from this old 2007 article, a US DoE chart projected that PV installations in the US would reach merely 15 to 30 GW. In reality, the 2020 number is about thrice the upper bound of that projection, at 80-90 GW, with over 3% of US electricity generated through PV. Even better, the world average is higher than the US average, and the world total continues to grow in excess of 25% a year. Remember that this ATOM advancement is tied to the advancement of electric vehicles, as crude oil is not merely being replaced with electricity; for many countries, the oil was imported, while the photovoltaic electricity is generated domestically. This is a victory against OPEC, as imports from OPEC are being replaced with domestic energy. As we recall Swanson's Law, the 40-year trend is consistent (note that both axes are logarithmic). While it took decades for PV to reach 3% of world electricity consumption, the jump from 3% to 12%+ will not take longer than a few years. The inflection point is here, and the dollar impact is among the biggest of any ATOM transformations currently underway. Even more than the dollar impact, it is the multiplier effect of these specific dollars, given where the shift is being generated. (Images from Wikipedia and Our World in Data. Click to enlarge.)

The best part about solar, which is not true of wind power, is that the poorest countries in the world are in fact the ones with the greatest solar intensity, and are thus the most suitable for solar. Much has been written about why cold-weather cultures have done better than hot-weather cultures on average, but now the very resource that was not being monetized can be monetized. By contrast, wind power, while good, is both slower-advancing and most applicable in countries that are already wealthy, and so does not have quite the same multiplier effect. This map of solar intensity indicates where the greatest utility of photovoltaics can reside. The absolute lowest-prosperity countries are still too disorganized to take advantage of this, but even the third quartile is well-prepared to rapidly increase PV installations and move away from oil imports. In the near future, it will seem quite absurd that people were importing their energy from thousands of miles away.

Related:
The End of Petrotyranny - Victory
The End of Petrotyranny
Solar Power's Next Five Game-Changing Technologies
A Future Timeline for Energy
Why I Want Oil to Hit $120 per Barrel (epic 2007 article)

Related ATOM Chapters:
3. Technological Disruption is Pervasive and Deepening

The nature of the subject matter covered here leads to relatively few commenters (even though readership is still over 200 visits/day), despite the extraordinary importance of the subjects covered. Frankly, without commenter Geoman's quality and quantity of comments over the last decade-plus, we would not be above the minimum comment level for a viable commenter community. I haven't given recognition for the best comment of a particular year in the past, but a comment several months ago stood out as exceptional. At 1,026 words, it is article-length in its own right.
It is from commenter HB : Kartik, Geoman,Yes, status symbols and prop 13 may be contributing but the main reason for SV housing bubble inflation is regulatory.In my view, the pivotal legislative change that made this possible happened subtly some decades ago (1970s) when California mandated the creation of regional planning powers (region-wide urban boundaries and such) enabling housing supply restrictions under the appealing name of smart-growth , a marketing name for a host of housing supply restrictions which in reality is a pretty dumb suicidal move that voter-lemmings seem to easily embrace the world over. So now due to the regional planning committee powers the residents of Silicon Valley cannot only block new development in their own cities but, more importantly, can block the creation of even new cities in their entire commutable region, plus also block housing supply increases in neighboring cities. Were it not for that, some cities would see opportunity and break ranks and increase housing supply, or entire new cities would spring up nearby to be quickly populated by new residents from the rest of the country/world and professionals escaping SV housing prices. Businesses would then follow the migrating and expanding talent. But all this can -- and is-- blocked by the expanded California regional housing powers which have been populated by an enviro-NIMBY coalition, with many voters openly approving and even more voters silently consenting, as in the case of busy immigrants.For example, one of the potent repressive housing supply restriction tools of smart growth is the ability of any minority to block developments that would benefit large majorities.A small five person minority group of native established stay at home housewives, who lucked out and bought their SV houses in the seventies, who realize they got a good deal and want to freeze their good luck in time, who spend their ample idle time picking invasive weeds on some SV hillside, can go to city hall and block an entire housing project for two hundred or more immigrants or out of Valley Americans with thousands of times the productive modern world capacity compared to the five housewives. And, as Kartik points out, the two hundred immigrants and their families are too humble in their new environment and primarily too busy to go down to a city hall where the voice of the housing project blocker (by legislative design) has many times the power of the housing project promoter in the fist place. Also the five housewives have an immediate (short sighted in this case indeed) stake in blocking the project, while the immigrants increased housing supply benefit (from the specific project) is much more vague and diluted -- yet just as real.A typical similar aspect of these dynamics can be seen at the beginning of the development cycle. Farmer Bob, farmer Jay, farmer Ben, and farmer Don each own one hundred acre farms. Farmer Ben subdivides his land and sells it for housing, initially affordable. Five years later two hundred families move in the area. Five years later the residents create a Save the Bob Jay and Don hillside coalition and under California s *mandated* regional planning can -- and will -- block development on Bob s Jay s and Don s land who, after all, have become a small powerless minority against the holly enviro-nimby alliance of new residents. 
Suicidal policy aside, one should also not overlook the fact that Bob s, Jay s and Don s land is de-facto in large part confiscated by a majority who does not want the three remaining farmers to do to their land what was done to build the very housing where the current newly established majority just started living in. Such theft is (or perhaps should according to some views) be protected from democracy by the constitution, but apparently, in practice it is not. Indeed, the main function of a constitution in a democracy is a rather broad contract to not screw each other when we get the majoritarian opportunity. At least that is in the American constitution. Without it the feeling that some day, in some way, they will come for you too breaks down societal trust, and the country becomes the basket case nation that is the rule in most of the world where developed nations are the exception -- and developed nations that at least match average world growth (i.e. they are not in decline) are an even rarer exception. Also, not coincidentally, our newly formed one hundred California households and the regional powers granted to them by the legislature can block an even bigger majority of, say, three hundred new very competent households from moving into their area from the Midwest or the rest of the world, thereby blocking the even bigger potential newcomer majority before they even get the chance to vote.So Silicon Valley residents will continue their agonizing commutes from their cubicles to their crummy houses, during which long commutes they will dream of making another half million dollars at their cubicle (that is one million before taxes and deductions) so they can add another bedroom and bath to their old house. Now, mind you, these are supposed to be some of the select smartest people in the nation/world!As is often typical, the roots of a region s decline are inconspicuously established in the success phase of the cycle, when they are hardly noticed. But predicting when exactly the decline starts is virtually impossible for everybody, except perhaps a few very enlightened and also perhaps lucky very intelligent people. Until then the fear of missing out keeps pushing most of us towards inflating the bubble and the eventual destruction of the very area we like.Kartik mentions in his opening statement how odd it is that the technology industry has so little awareness of this. Indeed the general behavior of the electorate on this issue is completely irrational -- and suicidal. Talking about shooting yourself in the wallet.Geoman s comment about his parents illustrates in many ways the case of those offspring that are priced out of the SV area -- and its potential opportunity -- which over a lifetime might have been many times over the parents house cash out, especially in households with more than one child. This blocked migration, in turn, imparts an even larger missed opportunity on the rest of the world who would greatly benefit from the foregone increased innovation, and thus an even faster ATOM. That is quite the epic comment, and you should click on the link and read the whole thread above and below it as well for context. This sort of legislatively-derived resource misallocation, combined with low technological innovation in the construction and communication sectors, has caused this problematic state of affairs and delaying the puncturing of this bubble that has gone on longer than many of us expected. 
One could say that there are at least four technologies working against this sort of resource misallocation, but at the same time, the centralizing forces of being an economic hub also work in favor of ever-greater centralization. It is time for another ATOM AotM. This month s award has a major overlap with the November 2017 award, where we identified that telescopic power has been computerized, and as a result was rising at 26%/yr. This itself was a finding from a much older article from all the way back in September 2006, where I first identified that telescopic power was improving at that rate. But how do better telescopes improve your life? Learning about exoplanets and better images of stars are fun, but have no immediate relevance to our individual daily challenges. If you are not interested in astronomy, why should you care? Well, there is one area where this advancement has already improved millions and possibly billions of lives : we have now mapped nearly all of the Near Earth Objects (NEOs) that might be large enough to cause a major disaster if any of them strike the Earth. Remember that this is an object with a mass that may be billions of tons, traveling at about 30 km/sec (image from sciencenews.org), of which there are many thousands that have already orbited the sun over 4 billion times each. All of us recall how, in the 1990s, there were a number of films portraying how such a disaster might manifest. Well, in the 1990s, we had little awareness of which objects were nearby at what time, and so there really was a risk that a large asteroid could hit us with little or no warning. However, as telescopes improved, 26%/yr (the square root of Moore s Law, since pixel numbers increase as a square of linear dimension shrinkage) got to work on this problem. Now, as of today, all asteroids larger than 1km are mapped, and almost all of the thousands that are larger than 140m (the size above which it would actually hit the surface, rather than burn up in the atmosphere) are mapped as well (chart from Wikipedia). We have identified which object might be an impact risk in what year. In case you are wondering, there is a 370m asteroid that will get very near (but not hit the Earth) in 2036. Of course, by 2036, we will have mapped everything with far more precision, at this rate of improvement. In other words, don t worry about an asteroid impact in the near future, as none of significance are anticipated in the next 17 years, and probably not for much longer than that. Comets are a different matter, as we have not mapped most of them (and cannot, as of yet), but large ones impact too infrequently to worry about. Hence, the risk of an impact event, and mitigation thereof, is no longer a technological problem. It is merely a political one. Will the governments of the world work to divert asteroids before one hits, or will they only react after one hits in order to prevent the next impact? These questions are complicated, as this problem is completely borderless. Why should the United States pay the majority of the expense for a borderless problem, particularly one that has a 71% chance of hitting an ocean? At any rate, this is another problem that went from deadly to merely one of fiscal prioritization, on account of ATOM progress. More interestingly, within this problem is another major business opportunity that we have discussed here in the past. 
Asteroid mining is a potential industry that goes hand in hand with asteroid diversion, since outright pulverization would waste precious metals that could otherwise be captured. Many asteroids have a much greater proportion of precious metals than the Earth's surface does, since precious metals are heavy and most of that quantity sank to the center of the Earth while the Earth was forming, whereas an asteroid, with much lower gravity, has its precious metals more evenly distributed throughout its structure. There are already asteroids identified that have hundreds of tons of gold and platinum in them. Accessing these asteroids will, of course, crush the prices of these metals as traded on Earth (another ATOM effect we have seen elsewhere in other commodities), and may reduce gold to an industrial metal that is used in much the way copper is. This, of course, may enable new applications that are not cost-effective at the current prices of gold, platinum, palladium, etc. But that is a topic for another time.

Related:
ATOM AotM, November 2017
SETI and the Singularity
Telescope Power - Yet Another Accelerating Technology

Exactly 10 years ago, I wrote an article presenting my own proprietary method for estimating the timeframe of the Technological Singularity. Since that time, the article has been cited widely as one of the important contributions to the field, and as a primary source of rebuttal to those who think the event will arrive far sooner. What was, and still is, a challenge is that the mainstream continues to scoff at the very concept, whereas the most famous proponent of this concept persists with a prediction that will prove to be too soon, which will inevitably court blowback when his prediction does not come to pass. Now, the elapsed 10-year period represents 18-20% of the timeline from the original article to the predicted date, albeit only ~3% of the total technological progress expected within that period, on account of the accelerating rate of change. Now that we are considerably nearer to the predicted date, perhaps we can narrow the range of estimation somewhat, and provide other attributes of precision. In order to see if I have to update my prediction, let us go through updates on each of the four methodologies one by one, of which mine is the final entry.

1) Ray Kurzweil, the most famous evangelist for this concept, has estimated the Technological Singularity for 2045, and, as far as I know, is sticking with this date. Refer to the original article for reasons why this appeared incorrect in 2009, and for the biases that may have led to his selection of this date. As of 2019, it is increasingly obvious that 2045 is far too soon a date for a Technological Singularity (which is distinct from the pre-Singularity period I will define later). In reality, by 2045, while many aspects of technology and society will be vastly more advanced than today, several aspects will remain relatively unchanged and underwhelming to technology enthusiasts. Mr. Kurzweil is currently writing a new book, so we shall see if he changes the date or introduces other details around his prediction.

2) John Smart's prediction of 2060 ± 20 years, made in 2003, is consistent with mine. John is a brilliant, conscientious person and is less prone to letting biases creep into his predictions than almost any other futurist. Hence, his 2003 assessment appears to be standing the test of time. See his 2003 publication here for details.
3) The 2063 date in the 1996 film Star Trek: First Contact portrays a form of technological singularity, triggered by the effect that first contact with a benign, more advanced extraterrestrial civilization had on changing the direction of human society within the canon of the Star Trek franchise. For some reason, they chose 2063 rather than an earlier or later date, answering what was the biggest open question in the Star Trek timeline up to that point. This franchise, incidentally, does have a good track record of predictions for events 20-60 years after a particular Star Trek film or television episode is released. Interestingly, there has been exactly zero evidence of extraterrestrial intelligence in the last 10 years despite an 11x increase in the number of confirmed exoplanets. This happens to be consistent with my separate prediction on that topic and its relation to the Technological Singularity.

4) My own methodology, which also gave rise to the entire ATOM set of ideas, is due for an evaluation and update. Refer back to the concept of the prediction wall, and how in the 1860s the horizon limit of visible trends was a century away, whereas in 2009 it was perhaps 2040, or 31 years away. This wall is the strongest evidence of accelerating change, and in 2019, it appears that the prediction wall has not moved 10 years further out over the elapsed interval. It is still no further than 2045, or just 26 years away. So in the last 10 years, the prediction wall has shrunk from 31 years to 26 years, or approximately 16%. As we get to 2045 itself, the prediction wall at that time might be just 10 years, and by 2050, perhaps just 5 years. As the definition of a Technological Singularity is when the prediction wall is almost zero, this provides another metric through which to arrive at a range of dates. These are estimations, but the prediction wall's distance has never risen or even stayed the same. The period during which the prediction wall is under 10 years, particularly when Artificial Intelligence has an increasing role in prediction, might be termed the pre-Singularity, which many people will mistake for the actual Technological Singularity.

Through my old article, The Impact of Computing, which was the precursor of the entire ATOM set of ideas, we can estimate the progress made since original publication. In 2009, I estimated that exponentially advancing (and deflation-causing) technologies were about 1.5% of World GDP, allowing for a range between 1% and 2%. Ten years later, I estimate that number to be somewhere between 2% and 3.5%. If we use a newly updated range of 2.0-3.5% in the same table, and estimate the net growth of this diffusion relative to the growth of the entire economy (Nominal GDP) at the same range of 6% to 8% (the revenue growth of the technology sector above NGDP), we get an updated table of when 50% of the World economy will comprise technologies advancing at Moore's Law-type rates. We once again see these parameters deliver a series of years, with the median values arriving at around the same dates as the aforementioned estimates.

Taking all of these points in combination, we can predict the timing of the Singularity. I hereby predict that the Technological Singularity will occur in:

2062 ± 8 years

This is a much tighter range than we estimated in the original article 10 years ago, even as the median value is almost exactly the same. We have effectively narrowed the previous 25-year window to just 16 years.
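For readers who want to reproduce the logic of that diffusion table, here is a minimal sketch under the assumptions stated above (a 2.0-3.5% starting share and 6-8% annual growth above NGDP). The original table is not reproduced in this archive, so the printed years approximate the method rather than the published cell values.

```python
import math

# Approximate reconstruction of the diffusion table's logic (illustrative, not the published table).
# Starting share of World GDP made up of exponentially advancing technologies, and the
# annual growth of that share relative to Nominal GDP, both taken from the text above.
start_year = 2019
target_share = 0.50

for start_share in (0.020, 0.035):
    for rel_growth in (0.06, 0.08):
        years = math.log(target_share / start_share) / math.log(1 + rel_growth)
        print(f"start {start_share:.1%}, growth {rel_growth:.0%}/yr above NGDP "
              f"-> 50% of World GDP around {start_year + years:.0f}")

# The four corner cases land roughly between the mid-2050s and the mid-2070s,
# with the median in the early 2060s, consistent with the 2062 +/- 8 prediction above.
```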
It is also apparent that by Mr. Kurzweil's 2045 date, only 14-17% of World GDP will be infused with exponential technologies, which is nothing close to a true Technological Singularity. So now we know the when of the Singularity. We just don't know what happens immediately after it, nor can anyone know with any certainty.

Related:
Timing the Singularity, v1.0
The Impact of Computing
Are You Acceleration Aware?
Pre-Singularity Abundance Milestones
SETI and the Singularity

Related ATOM Chapters:
2: The Exponential Trendline of Economic Growth
3: Technological Disruption is Pervasive and Deepening
4: The Overlooked Economics of Technology

Our persistence in contacting the Federal Reserve and urging them to educate themselves on why their outdated assumptions about macroeconomics are not resulting in the outcomes they predict has paid off. While there is no way of knowing which Federal Reserve Governor may have seen some of our emails, or if in fact it is due to our campaign at all, there has been a very sudden paradigm shift at the Federal Reserve on no less than three fronts. According to this article from CNBC, the Federal Reserve has suddenly accepted three points that are extremely familiar to readers of The Futurist and the ATOM publication, but were anathema to the ivory tower orthodoxy of credentialed Macroeconomists:

i) The Federal Reserve now admits that a low (3.7%) unemployment rate need not cause inflation, the way it might have in 1969-74.
ii) The Federal Reserve now admits that the normal Fed Funds rate may be lower than its previous assumption of 3% (it is actually 0%, but the Federal Reserve is at least moving in the right direction).
iii) The Federal Reserve now admits that the minimum possible unemployment rate is lower than the floor they previously assumed.

The fact that a body that rigidly disputed all of these notions until just last month has shifted so suddenly is an immense victory for the ATOM, and for all of us who took the time to write to the Federal Reserve and teach them about the changes in their field. There is no profession more oblivious to how technology changes their field than Macroeconomists, and this stunning shift is a delight to see. Now it is time to update the ATOM Publication, for a version 2.0. The recession that we have been perilously close to the brink of from 2017 to the present (a top-to-bottom cycle takes 18-30 months) is now much more likely to be avoided. This is the first time since 2015 that this has been the case.

Update (8/7/19): The Federal Reserve did indeed cut the Fed Funds rate by 0.25%, bringing it down from 2.5% to 2.25%. More importantly, they also signaled an intention to halt Quantitative Tightening (QT), so as to increase net US QE from -$50B/month to $0 (thereby causing a net increase of $50B/month in World QE). The 10-yr Treasury yield has since fallen to 1.6%, making the yield curve even more inverted than before. Naturally, the Fed Funds rate has to return to 0% just to create a normalized yield curve, and QE has to resume to create an effectively negative front end in order to restore a normal, 2-3% spread yield curve.

Related Articles by Chronology:
Bond Yields Continue to Confirm ATOM Revelations, July 17, 2016
The Federal Reserve Continues to Ignore Technological Deflation, September 22, 2016
The Federal Reserve Continues to Get it Wrong, May 19, 2018

The most recent employment report revealed 279,000 new jobs (including revisions to prior months), and an unemployment rate of just 3.6%, which is a 50-year low.
Lest anyone think that this month was an anomaly, the last 12 months have registered about 2.6M new jobs (click to enlarge). Over the last two years, the Federal Reserve, still using economic paradigms from decades ago, assumed that when unemployment goes below 5.0%, inflation would emerge. With this expectation, they proceeded on two economy-damaging measures : raising the FF rate and Quantitative Tightening (i.e. reversal of Quantitative Easing, to the tune of $50B/month). As the Fed raised the Fed Funds rate all the way up from the appropriate 0% to the far-too-high 2.5%, the yield on the 10-year note is still 2.1%, resulting in a negative yield curve. Similarly, inflation continues to remain muted, even after $23 Trillion and counting of worldwide QE, as I have often pointed out.Yet, the Federal Reserve STILL wanted to raise interest rates, in direct violation of their own supposed principles regarding both the yield curve and existing inflation. They were exposed as looking at only one indicator : the unemployment rate. Their actions reveal that they think that a low unemployment rate presages inflation, and no other indicator matters. Now, for the big question : Why do they think any UE rate under 5.0% leads to inflation, and why are they getting it so wrong now? The answer is because back in the 1950-80 period, too many people having jobs led to excess demand for materially heavy items (cars, houses, etc.). In those days, there was far too little deflationary technology to affect traditional statistics. Today, people still buy these things, but a certain portion of their consumption (say, 2%) comprises of software. Software consumes vastly less physical matter to deploy and operate, and never runs out of supply , particularly now in the download/streaming era. If Netflix had 10 million new people sign up tomorrow, the cost of servicing them would be very little, and the time spent to sign up all of the new customers would also be negligible. This is not hard to understand at all, except for those who know so much that isn t so . The Federal Reserve has over 600 PhDs, but if they all just cling to the same outdated models and look at just ONE indicator, having 600 PhDs is no better than having one PhD (and, in this case, worse than having zero PhDs). But alas, the Federal Reserve, (and by extension, most PhD macroeconomists) just cannot adjust to this 21st-century economic reality, even if they cannot explain the lack of inflation, and are incurious about why this is. They are afflicted with a level of egghead groupthink the likes of which exceeds what exists in any other major field today. When this happens, we are often on the brink of a major historical turning point. Analogous situations in the past were when the majority of mechanical engineers in the 1880s insisted that heavier-than-air flying machines large enough to carry even a single human were not possible, and when pre-Copernican astronomers believed the Sun revolved around the Earth. The percentage of the total economy that is converging into high-tech (and hence high-deflation) technologies is rising, and is now up to 2.5-3.0% of total world GDP. 
This disconnect can only widen.President Trump, seeing what is obvious here, has not just pressured the Federal Reserve to stop raising rates (which they were about to do in late 2018, which would have created the inverted yield curve that they supposedly consider to be troubling), but has recently said that the Fed should lower the Fed Funds rate by 1%, effectively saying that their last four rate hikes were ill-considered. He rightfully flipped the script on them. Now, normally I would be the first to say a head of state should not pressure a central bank in any way, but in this particular case, the President is correct, and the ivory-tower is wrong. The correct outcome through the wrong channel is not ideal, but the alternative is a needless recession that damages the financial well-being of hundreds of millions of people, and destroys millions of jobs. He is right to push back on this, and anyone who cares about jobs must hope he can halt and reverse their damage-causing trajectory. In this vein, I urge everyone who is on board with the ATOM concepts, and who wishes to avoid an entirely needless recession, to send polite emails to the Federal Reserve Board of Governors, with a request that they look at the ATOM publication and correct their outdated grasp of monetary effects from liquidity programs, and the necessity of modernizing the field of macroeconomics for the technological age. The website via which to contact them is here :https://www.federalreserve.gov/aboutthefed/contact-us-topics.htm We are at a crucial juncture in the history of macroeconomics, the economics of technology, and the entire concept of jobs and employment. It is a matter of time before a Presidential candidate stands before a cheering audience and points out how trillions of QE were done, but none of the people in the audience got a single dime. Imagine such a candidate simply firing up the audience with queries of Did you get a QE check? Did you get a QE check? ?Usted recibiste un QE cheque? That could be a political meme that gains very rapid momentum. This is how a version of UBI will eventually happen. We, of course, call it something better : DUES (Direct Universal Exponential Stipend). The question is, when least expected, such a leader will emerge (probably not in the US), to transition us to this era of new economic realities. It will certainly be someone from the tech industry (the greatest concentration of people who get it regarding what I have just elaborated above). Who will be that leader? A major juncture of history is on the horizon. All roads lead to the ATOM. Related ATOM Chapters :4. The Overlooked Economics of Technology6. Current Government Policy Will Soon be Ineffective10. Implementation of the ATOM Age for Nations For this month s ATOM AotM, we return to the familiar, but in the process, we want to recognize an ATOM trend that has not gotten as much credit as it has deserved. We all know what Moore s Law is, and what has been enabled by it. But what has always been amazing to me is how little recognition a similar law has received. Storage capacity has risen at a rate equal to (or slightly higher than) Moore s Law. It is not a technological byproduct of Moore s Law, as it has always been worked on by different people in different companies with different technical talents. If storage capacity were not improving at the same rate as Moore s Law, most computer-type products would not have continued to produce decades of viable new iterations. 
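As a rough illustration of the doubling rate that claim implies, here is a minimal sketch. The roughly billion-fold cost improvement over about 40 years is the figure discussed just below; the derived doubling interval is my own arithmetic, not a number from the post.

```python
import math

# Rough arithmetic on the storage trend (illustrative; the ~10^9 cost improvement
# over ~40 years is the figure discussed in the post, the doubling time is derived from it).
improvement = 1e9   # one dollar today buys what roughly a billion dollars bought ~40 years ago
years = 40

doublings = math.log2(improvement)               # ~30 doublings
doubling_time_months = years * 12 / doublings
print(f"{doublings:.0f} doublings in {years} years "
      f"-> one doubling every {doubling_time_months:.0f} months")
# Roughly every 16 months, i.e. equal to or slightly faster than the classic
# 18-24 month Moore's Law cadence, as the text states.
```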
From PCs to Smartphones to Video Game consoles, all have a storage requirement that has to match up to the size and number of files downloaded and processed. Correspondingly, data transfer speeds have also had to rise (USB 1.0 to 2.0 to 3.0, Ethernet to Fast Ethernet to Gigabit Ethernet, etc.). A 2019-era PC could not have a 2006-era hard drive and be very useful. Like Moore's Law, the exponential doubling has spanned a sequence of technologies that all sustain the underlying megatrend, from internal spinning hard drives, to flash storage, to storage in the cloud. Greater density has been matched by shrinking weight per unit of storage and lower power consumption. This also means that as computing decouples from Moore's Law and moves into different (and probably faster) forms of exponential growth, storage will almost certainly follow suit. DNA-based storage is a prospective technology that has many attributes comparable to expected future computing technologies such as Quantum Computing. Unlike Moore's Law, storage has not always advanced at a steady rate. There are times when it advanced much faster than Moore's Law, and times when it advanced much slower (such as in recent years). The 40-year average, however, does appear to match Moore's Law's doubling rate rather closely, and hence what one dollar purchases today matches what one billion dollars could purchase then, and that billion-dollar storage system would itself have been the size of a house. Also unlike Moore's Law, there is no universally accepted name associated with this trend. Mark Kryder's name is sometimes attached to it, but he did not put forth his observation early enough for it to count as a prediction by any measure (Kryder officially spoke of this in 2005, whereas Moore made his prediction in 1965), his name is not mentioned in any of Ray Kurzweil's writings or other publications, and since he was not the founder of a major storage company, he is not analogous to Gordon Moore. As rising storage efficiency is crucial to the productization of any other computing product (including Smartphones), it deserves recognition for its contribution to the technological age, despite often being overlooked in favor of Moore's Law.

Related ATOM Chapters:
3. Technological Disruption is Pervasive and Deepening

For this month's ATOM AotM, we examine something that even the rest of the technology industry has virtually no awareness of, and that the US public is entirely oblivious to, even though we have a President from the construction industry. The US construction industry has had no net productivity gain in the last 70 years. Even worse, it has declined by 50% over the last 50 years. Construction should be seen as a type of manufacturing, as most construction is not devoted to anything highly customized or unusually complex. Yet manufacturing itself has risen in productivity by 800% over the same period in which construction has not risen at all. A combination of organized crime, government graft, and an anti-productivity ethos has contributed to this epic failure. Given that construction is about 7% of the US economy, this is troubling. Imagine if that 7% were 16x more productive (i.e. merely keeping up with manufacturing). Americans, particularly urban Americans, don't realize that they could have thrice the square footage for the same price if this sector had merely kept up with manufacturing. There would also be several hundred thousand more jobs in construction, and much broader home ownership.
Meanwhile, outside of the many biases of the Western media, there is an amazing example of supreme construction efficiency. China has grown at 7% a year over the last 20 years, even as the US has shrunk at -1% a year (click image to enlarge). The productivity of China has greatly enlarged the size of its construction sector, to the extent that it is 20% of China s economy vs. just 7% in the US. While the two countries are a different stages of growth and China is still at a much lower absolute level, the differential is still immense. Normally, in any industry, such an immense productivity differential leads to the productive country exporting products to the less productive country, swiftly driving local unproductive businesses to their deserved demise. Construction, however, produces a product that is not transportable, so a productivity normalization has not happened. At least not yet. But this high of a differential eventually finds a way to engineer a normalization. Modular construction is one method where parts are manufactured, and then assembled on site. China could start exporting this to the rest of the world. Here is a Spire Research report on the advances in China s construction technology.The Western media, in its hubris, is quite willing to criticize China for building entire cities 10 years before they are needed. How often have we seen stories about empty cities in China that take a few years to fill up? By contrast, the United States (and California in particular) does something much worse, which is to build structures 20 years after they are needed. Given the choice between these two schedule misalignments, China s approach is vastly preferable. Beyond this, the costs of US ineptitude are about to become more problematic. The eCommerce revolution is exposing the massive misallocation of land toward retail space, that is a uniquely American distortion. Part of this is due to a peculiar depreciation schedule in the tax code originating in 1954. The abundant land in the US interior led to the same lopsided usage of land in California, leading to the grotesque situation we have today where ultra-expensive housing resides next to vast, empty parking lots. High California housing prices are the product of extreme artificial supply restriction, aided by low construction productivity that ensures an apartment complex takes three years to complete, where the same in China takes under one year. Dramatic photos of dead malls can easily evoke emotions in the average American, who has been trained to think this sort of retail experience is normal. But charts that reveal the unique extent of US profligacy with regards to retail land reveals a much more logical sequence of impending events. As eCommerce continues to shutter brick and mortar retail, there will be a rising groundswell of pressure to repurpose this land for a more contemporary use. Unfortunately, the inadequate level of US construction productivity threatens to greatly delay this conversion, severely damaging our national competitiveness relative to China. On the subject of where the US may see China catch up, most of the focus is on Artificial Intelligence, Quantum Computing, and other high-concept technologies. Yet the construction productivity differential alone represents the single biggest sectoral deficit from the point of view of the US and many other countries. 
China is well-positioned to dominate the entire construction industry worldwide once it can more easily win international contracts and transplant its productivity practices abroad. If the US blocks Chinese construction imports, other countries across the world will happily partake in these high-quality end products. This should be welcomed by anyone with a free-market bent. For this reason, China's construction sector, in breaking the low-productivity pattern seen in almost the entire rest of the world, is the recipient of the February 2019 ATOM AotM.

Commenter Geoman alerted me to a lengthy article at Seeking Alpha by Ramy Taraboulsi. It is spread across 29 different pages, and is thus not well suited to that format. I have been reading it, and the similarities with the ATOM publication are many. Given its length, I want to get some more eyeballs on it, so I hope that some of my readers here can examine it and point out salient differences, and possible points that might strengthen the ATOM publication. I don't want to bias readers with any comments before they read it themselves. This coincides with my intention to update the publication for 2019 with the two major new ideas that have emerged since initial publication (the Sovereign Venture Fund and the Monetization of Data generated through DUES spending). There is room for the incorporation of other suitable ideas as well, if sufficiently stress-tested. I had a new article for the February 2019 ATOM AotM completed and ready to go, but I want to hold off on that for a bit until I get some more readers' comments on Ramy Taraboulsi's article.

It is time for another ATOM AotM. This month, we return to the energy sector, for it is where the greatest size and scale of ATOM disruptions are currently underway. We visited batteries briefly in August 2017's ATOM AotM. There are two exponentials here: battery cost per unit of energy, and battery energy density per unit of volume. Hence, despite 40 years of apparent stagnation interspersed with angst about how electric vehicles failed to arrive in response to the 1973 and 1981 oil spikes, the exponential trend quietly progressed towards the inflection point that we have now reached. True to the exponential nature of technology, more progress happened in 2011-20 than in 1970-2010, and we now have viable electric vehicles that are selling rapidly and are set to displace gasoline consumption in a matter of just a few short years. Electric vehicles are now 2% of all new vehicle sales in the US, and 3% worldwide, with a high annual growth rate. Due to the rapid cost improvements in EVs expected in the next three years, a substantial tipping point is perhaps no more than three years away. This rapid rise is in the face of two major headwinds: the low price of oil (due to another ATOM disruption), and the high price of EVs (the top seller in units is a $50,000 vehicle). It is now a certainty that once a high-quality EV becomes available at the $30,000 price point, the speed of displacement will be startling. A tracker that records monthly sales at both US and worldwide levels is here. The speed of advancement merits monthly visits to this tracker at this point. Note that over time, the US is actually where total displacement of ICEs by EVs will be the slowest, since other countries are more suited to EVs than the US (they have higher gasoline prices, and often 220V electrical outlets that lead to faster charging).
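On the charging point, the advantage of a 220-240V outlet is simple arithmetic, since charging power is volts times amps. The circuit ratings and pack size below are generic assumptions for illustration, not the specifications of any particular vehicle or charger:

# Why 220-240V outlets matter: power (kW) = volts x amps / 1000.
# Circuit ratings and the 75 kWh pack below are illustrative assumptions only.
def hours_to_charge(battery_kwh, volts, amps, efficiency=0.9):
    power_kw = volts * amps / 1000.0
    return battery_kwh / (power_kw * efficiency)

battery_kwh = 75
print(f"120V / 12A wall outlet:    {hours_to_charge(battery_kwh, 120, 12):.0f} hours")
print(f"240V / 32A garage circuit: {hours_to_charge(battery_kwh, 240, 32):.0f} hours")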
In fact, a suddenly popular home upgrade in the US is, ironically, the installation of 220V outlets in the garage, specifically for EV charging. As an example of a true ATOM disruption, the transformation will be multi-layered. From oil import/export balances, to gasoline refinement and distribution networks, to the reduction of near-slave labor from the Subcontinent forced to find work in Gulf petrostates, to mechanics dependent on ICE vehicle malfunctions, to surplus used ICEs that cannot sell without slashed prices, to power management and pricing by electric utilities, to prime land occupied by gas stations, a variety of status quos will end. Don't underestimate how soon the domino effect will take place. Once EVs are sufficiently mainstreamed in the luxury car market (which is set to happen in 2019), the entire range of urban commercial/government vehicles will swiftly transition to electric. The US has 800,000 police cars, 210,000 USPS vans, and a variety of other government vehicles. On top of that, private fleets include 110,000 UPS vehicles, 60,000 FedEx vehicles, and perhaps over 300,000 pizza delivery vehicles. As these transition, observe how many gasoline stations shutter. The much greater lifespan of EVs relative to ICEs will be one of the four factors that lead to the majority of automobile use migrating to an on-demand, autonomous vehicle model by 2032, as discussed before.

Related: The End of Petrotyranny; Why I Want Oil to be $120/Barrel
Related ATOM Chapters: 3. Technological Disruption is Pervasive and Deepening

For this month's ATOM AotM, we visit the medical industry, and examine a technology that seems quite intuitive, but that, on account of patents and other obstacles, has seen rapid improvement greatly delayed until now. Surgery seems like a field to which robotics would be ideally suited, since it combines complexity and precision with a great deal of repetition of well-established steps. The value of smaller incisions, fewer instances of bones being sawed, etc. is indisputable, from qualitative measures such as pain during healing, to tangible economic metrics such as the duration of the post-surgery hospital stay. Intuitive Surgical released its Da Vinci robot to the market in 2001, but on account of its patents, it sustained a monopoly and did not improve the product much over the subsequent 17 years. Under ATOM principles, this is a highly objectionable practice, even if technically the company can still earn a high profit margin without any product redesigns. As a result, only 4000 such robots are currently in use, mostly in the US. Intuitive has achieved a market capitalization of over $60 Billion, so it has succeeded as a business, but this may soon change. Now that Intuitive's patents are finally close to expiry, a number of competitors are ready to introduce ATOM-consistent exponential improvements into the competitive landscape. The Economist has a detailed article about the new entrants into this market, and the innovations they have created. In addition to mere cost reduction due to smaller electronics, one obvious extension of the robotic surgery model is for each robot to be connected to the cloud, where the record of each surgery trains an Artificial Intelligence to ensure ever-improving automation for several steps of the surgery. With AI, greater usage drives improvement, and when thousands of surgeries around the world are all recorded, every machine gets better simultaneously.
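A toy sketch of that fleet-learning loop is below. It is purely illustrative: the class and field names are hypothetical, "training" is reduced to a running statistic, and this is not a description of Intuitive's or any competitor's actual system:

# Illustrative only: each connected robot contributes surgery records to a shared
# pool, and a model refit on the pooled data benefits every machine at once.
from statistics import mean

class SharedSurgeryModel:
    def __init__(self):
        self.records = []                 # pooled records from all robots

    def add_records(self, new_records):
        self.records.extend(new_records)  # a robot uploads its latest cases
        return self.refit()

    def refit(self):
        # Stand-in for real training: estimate a typical step duration from all data.
        return mean(r["step_seconds"] for r in self.records)

model = SharedSurgeryModel()
print(model.add_records([{"step_seconds": 42}, {"step_seconds": 38}]))  # one hospital reports
print(model.add_records([{"step_seconds": 35}]))  # estimate updates as more robots report in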
As costs fall and unit volumes increase, the volume of data generated rises. As the accumulation of data rises, the valuation of companies capturing this data also rises, as we have seen in most other areas of technology. This level of data, combined with greater circuitry within the robot itself, can also increase the speed of surgery. When more of it is automated, and the surgeon is doing less of the direct manipulation, what is to prevent surgeries from being done at twice or thrice the speed? This enables a much shorter duration of anesthesia, and hence fewer complications from it.

If we could point to one aspect that makes the modern era different from centuries past, the premier candidate for that distinction is how the centuries-old exponential, accelerating trend of technological progress manifests in economics, and the fact that the trendline is now in a steep upward trajectory. These are all worldwide metrics, and have to be. But if one examines the components, the variance contained therein is immense. One table that I use relatively often is the one that depicts relative GDP gain by country, and I have in the past used it to describe how the 2008-09 crisis led to the rebound happening elsewhere. Google has just updated its economic data engine for 2017, enabling a full decade to be included from the start of the prior crisis. This enables us to see what happens when the global economy experiences a major dislocation. The Great Depression (1929-39) was one such dislocation, and while the trendline is too steep today for a downturn of similar duration to manifest in the global economy, the more recent dislocation was almost as dramatic in terms of how it reoriented the tectonic plates of the global economy. From the table, we see that the World Economy grew by 40% in Nominal GDP. We do not adjust for inflation in these metrics for reasons detailed in the ATOM publication, and we take the US$ metric as universal. The US, remarkably, did not grow at a much slower rate than the world average, and hence has not yet experienced a substantial proportional shrinkage. By contrast, the rest of the advanced world has scarcely grown at all, while European economies have outright shrunk. An advanced country, of course, does not have the same set of factors to contend with as an emerging economy at a stage where high growth is easier, hence this is really two tables in one. India's underperformance relative to China is just as substandard as the UK's underperformance relative to the US. China has effectively dominated the entire world's growth. China has grown at an astounding 245%, partly due to a structural strengthening of its currency, which itself is partly due to the country's more advanced understanding of technological deflation and the monetization of it through its central bank (as per the ATOM concepts). India has not experienced any such strengthening of its currency (quite the opposite, in fact), which is why India's economy has grown at a far slower rate despite starting from a very low base. Consider this other chart, of GDP distribution by country (as per current borders) from the year 1 until 2017. The growth of China (and to a lesser extent, India) appears to be a reversion to a status quo that existed from the dawn of civilization all the way until the early 19th century. If this factor is combined with the exponential trend of world growth, then China's current outperformance seems less like an aberration.
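Converting those decade totals into annualized rates makes the divergence easier to compare. A short sketch, where the 40% and 245% inputs are the nominal US$ totals discussed above:

# Annualizing the decade totals discussed above (nominal US$, over the 10-year span).
def cagr(total_growth, years=10):
    return (1 + total_growth) ** (1 / years) - 1

print(f"World (+40% over the decade):  {cagr(0.40):.1%} per year")
print(f"China (+245% over the decade): {cagr(2.45):.1%} per year")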
This raises the question of what the next decade will look like. There is almost no chance that China can outperform the RoW by the same magnitude from this point onwards, simply because the RoW is no longer large enough to absorb the same intake of Chinese exports relative to China's size as before. But will the convergence take the form of China slowing down or the RoW speeding up? Will India experience the same convergence to its pre-19th-century proportional size, or is India a lost cause? Under the ATOM program, it could certainly be the RoW speeding up, since the advanced economies already have enough technological deflation that they can monetize it through central bank monetary creation. China, by contrast, will not be technologically dense enough for that until 2024 or so. The US could rise to 5-6%/year Real GDP growth by 2025. The current mindset in the Economics profession is vastly outdated, and there is little to no curiosity about accelerating economic growth rates, or about the relationship between technological deflation and central bank monetary action. If China can no longer be an outlet to accommodate the entirety of the trendline reversion force that is seeking to work around these obstructions, then explosive growth combined with chaotic disruption will happen somewhere else.

Related ATOM Chapters: 2. The Exponential Trendline of Economic Growth

For the May 2018 ATOM AotM, we will visit a technology that is not a distinct product or company, but rather a feature of consumer commerce that we would now find impossible to live without. This humble yet indispensable characteristic of multiple websites has saved an incalculable amount of frustration and productivity loss. I am, of course, referring to web-based reviews. Lest you think this is a relatively minor technology to award an ATOM AotM to, think again, for a core principle of technological progress is that a technology is most successful when it is barely even noticed despite a ubiquitous presence. Part of what has enabled eCommerce to siphon away an ever-rising portion of brick-and-mortar retail's revenue is the presence of reviews on sites like Amazon. Beyond eCommerce, sites like Yelp have greatly increased the information available to consumers seeking to patronize a low-tech business, while media sites permit a consumer to quickly decide which films and video games are worthwhile without risking a blind purchase. While false reviews were a feature of the early Internet for over a decade, there is now considerable ability to filter them out. I recall a frustrating episode that a friend and I experienced in 1999. We wanted to rent a film from Blockbuster Video, but did not know which one. We found one that had familiar actors, but the movie was extremely subpar, resulting in a sunk cost of the rental fee, transportation costs, and the time spent on the film and two-way transit. When returning to Blockbuster to drop off the VHS cassette, we selected another film based on the same criteria. It was even worse. We had rented two separate films over two separate round trips to Blockbuster, only to be extremely unsatisfied. Movie review sites like IMDB did exist at the time, but my friend did not have home Internet access (his Internet activities were restricted to his workplace, as was common at the time).
Now, in this anecdote, just list the number of ATOM disruptions that have transpired since: there is no longer a Blockbuster Video that rents VHS cassettes, as films are rented online or available through a Netflix subscription; everyone has home Internet access, and can see a film's reviews before ever leaving home. Hence, it is no longer possible to waste hours of time and several dollars on a bad film. The same goes for restaurants, and in this case, both the consumer and the business are shielded from an information mismatch on the part of the consumer. I have always felt that it was unfair for a patron to judge a restaurant negatively if they themselves did not order what they might have liked. Now, with Yelp, in addition to reviews, there are pictures, enabling a vastly more informed decision. Even for higher-stakes decisions, such as the selection of a dentist or auto mechanic, reviews have slashed the uncertainty that people lived under just 12 years ago. The better vendors attract more business, while substandard (or worse, unethical) vendors have been exposed to the light of day. This is a more powerful form of quality control than has ever existed before.

Now, to see where the real ATOM effects are found, consider the value of the data being aggregated. This drives better product design and better marketing. This also expands the roadmaps of accessory or complementary products. The data itself begins to fuel artificial intelligence, for remember that any pile of data of sufficient size tends to attract artificial intelligence to it. This leads to a lot of valuable analytics and automation. If one were to rank the primary successful Internet use cases to date, the ability to see reviews of products and services would rank very high on the list. For this reason, web-based reviews receive the May 2018 ATOM AotM.

In the ATOM publication, we examine how the only way to address the range of seemingly unrelated economic challenges in a holistic manner is to monetize technological deflation. For reasons described therein, the countries best suited to do this are small countries with high technological density. Furthermore, we examine the importance of the first-mover advantage: when a country can monetize the technological deflation in the rest of the world for the benefit of its domestic economy, the first $1 Trillion is practically free money. In Chapter 10, I outline a systematic program for how the US could theoretically transition to this modernization of the economy. But I then identify the four countries that are much more suitable than the US. These are two Western democracies (Canada and Switzerland) and two Pacific-Rim city-states (Singapore and Hong Kong). But it is possible to create custom solutions for more countries as well. To determine how to do that, let us go back to a seminal event in the emergence of these ideas.

What Japan Discovered for the Benefit of Humanity: Few people have any awareness of how important an event that happened in April of 2013 really was. Up to that time, the US was the only country that had embarked on a program to engineer negative interest rates through monetary creation (rather than the punitive and reductive practice of deducting from bank accounts). Japan decided that after two decades of stagnation and extremely low interest rates, something more drastic and decisive had to be done.
The early success of the US Quantitative Easing (QE) program indicated that a more powerful version of it could be effective against the even worse stagnation in which Japan's economy was mired. In April of 2013, the Bank of Japan (BoJ) decided to go big. It embarked on a program of monetary easing in the amount of 30% of annual GDP. This was a huge upgrade over the US QE programs for two reasons: firstly, it was much larger as a proportion of the host nation's GDP, and secondly, it had no end date, enabling long-term decisions. Since the formal economics profession in the West is burdened by a wide range of outdated assumptions about money printing, inflation, and technology, Western economists yet again predicted high inflation. And yet again, they were wrong. There was no inflation at the start of the program, nor in the years after. Japan had correctly called the bluff of the inflation specter. The third-largest economy in the world could print 30% of its GDP per year for five years, and still experience no inflation. When I observed this, I drew the connection between technological deflation (worldwide) and the vanishing QE (also worldwide). Most of Japan's QE was flowing outside of Japan (and indeed into the US, which had long since stopped QE, and has forestalled a major market correction only by drawing on overseas QE, mainly from Japan). Hence, the combined QE of the world was merely offsetting the technological deflation of the world. Japan's big gambit proved this, and in doing so, showed us how much QE can be done before world inflation even hits 3% (i.e. much more than formal economists thought).

What is a Small, Prosperous Country to do? While it is always better to be a prosperous country than an impoverished one, almost every small country (the size of Canada or smaller) faces a major vulnerability in the modern economy. Its economy invariably depends on one or two major industries, and is hence vulnerable to a technological disruption that arises from somewhere else in the world. The need to diversify against such external risks is obvious, but most countries are not on the best path to achieve this goal. These days, everyone I meet from the government of some foreign country seems to have the same goal for their country: to create an ecosystem of local technology startups. This goal is not just extremely difficult to attain; it is also misguided. Technology is becoming increasingly governed by winner-take-all dynamics and capital concentration, which means that even in the US, rival cities are unable to compete with Silicon Valley (which itself has concentrated into a smaller portion of the San Francisco Bay Area than was the case in the late 1990s). Small countries with technology sectors, such as Israel and Singapore, started decades ago and have a number of unique factors in their favor, including a major Silicon Valley diaspora. Hence, a country that thinks it is productive to create a local tech startup cluster will almost certainly create a situation where young people receive training at local expense, only to leave for Silicon Valley. So these initiatives only end up feeding Silicon Valley at the expense of the original country. Even if a few tech startups can be forcibly created in the country, it is extremely unlikely that they will achieve any great size within even 15 years.

Take, for example, a country like New Zealand. It has many favorable characteristics, but certain disadvantages as well in an increasingly globalized economy.
It relies on agricultural and dairy exports, as well as the film industry and tourism. It is too remote to easily plug into the well-traveled routes of tech executives (fewer than 30M people live within 3000 miles of New Zealand) or major supply chains. It is too small to be a significant domestic market for tech (particularly since a functional tech ecosystem has to comprise startups in multiple areas of tech in order to achieve rudimentary diversification). New Zealand's success in getting Hollywood films shot in New Zealand cannot similarly translate into getting some Silicon Valley business, as an individual film project has a short duration and a distinct ending, with key personnel on site for just a brief period. Technology, by contrast, is inherently endless, and requires interdependency between many firms that have to be co-located. Furthermore, no society is capable of placing more than 1-2% of its population into high-tech professions and still having them be competitive at the international level (most tech innovation is done by people in the top 1% of cognitive ability). For this reason, a tech startup ecosystem does not create broad prosperity (it is no secret that even within Silicon Valley, only a fraction of people are earning almost all of the new wealth; Silicon Valley has among the most extreme inequality found anywhere).

Now, from the research contained in the ATOM publication, we know that there is a far easier solution that can deliver benefits in a much shorter time. New Zealand's fiscal budget reveals that as of 2018, it collects about $80 Billion in taxes and spends the same $80B per year. The world was recently generating $200B/month in QE and is still doing an insufficient $120B/month. The entire annual budget of New Zealand is well below one month of the world's QE - the QE that is needed just to halt technological deflation. It would be very easy for New Zealand to waive all income taxes, and merely print the same $80B/year from its central bank. A brief transition period can be inserted just to soften the temporary downgrades that international rating agencies would deliver. But the waiver of income tax will boost New Zealand's economy with immediate effect. It can even enter and dominate the lucrative tax-haven industry until other countries adopt the same strategy.

As we know, it is difficult for government officials, legislators, and statesmen to take such a drastic step, particularly when the entire Economics profession is still mired in outdated thinking about how QE will someday, somehow cause inflation (despite being wrong about this for 9 years and over the course of $20 Trillion in cumulative world QE). For this reason, a second, less drastic option is also available to New Zealand. That involves creating what I describe as a Sovereign Venture Fund, where the New Zealand Central Bank creates a segregated account that is completely partitioned off from the domestic economy, and prints money to place into that account (say, $100 Billion). It is crucial that this money not circulate domestically at first, as that would cause inflation. The purpose of this $100B Sovereign Venture Fund is to invest in startups worldwide that might be disrupting New Zealand's domestic industries. This model is extremely effective and flexible, as:

i) The money was not taken from New Zealand taxpayers, but rather generated for free by the New Zealand Central Bank.
Hence, it can invest in speculative startups across the world with far more boldness.

ii) The diversification achieved is immediate, and can always be adjusted with equal immediacy as needed.

iii) The Fund leverages the rest of the world's technological deflation for New Zealand's domestic benefit.

iv) Tech startups worldwide become extremely vocal advocates for the fund, and even for the country itself. It boosts New Zealand's branding (generating even more tourism).

v) Fund gains can be used to offset government spending by replacing income tax, or to fund training to enable citizens to modernize their skills. They can also provide a greater social safety net to cushion industries buffeted by disruption, but without taxing those who are still working. This is how to repatriate the money without inflation.

vi) Even a larger fund of $800B can earn $80B/year from a 10% return, which exceeds the total taxes collected by the country.

The Sovereign Venture Fund is an extremely effective, speedy, and versatile method of economic diversification. It can be customized for any prosperous country (for example, an oil exporter should simply invest in electric vehicle, battery, and photovoltaic technologies to hedge its economic profile). As a huge amount of worldwide QE has to be done just to offset technological deflation, there is no contribution to inflation even worldwide, let alone domestically. As the winds of technological change shift, the Fund can respond almost immediately (unlike a multi-decade process of creating a tech startup ecosystem, only to worry whether the sectors represented are about to be disrupted). Since there is a very high and exponentially rising ceiling on how much world QE can be done before world inflation reaches even 3% (about $400B/month in 2018, as per my calculations), there is an immense first-mover advantage possible here. The first $1 Trillion is effectively free money for the country that decides to be Spartacus. New Zealand, in particular, has even more factors that make it a great candidate. The NZ$ is currently too strong, which is crimping New Zealand's exports. This sort of program may create a bit of currency weakening just from the initial reaction. For this additional reason, it is a low-risk, high-return strategy for generating a robust and indeed indestructible safety net for New Zealand's citizens, hedging them from the winds of global technological disruption.

Related ATOM Chapters: Chapter 4: The Overlooked Economics of Technology; Chapter 10: Implementation of the ATOM Age for Nations

For this month's ATOM AotM, we will address the sector that any thought leader in technological disruption recognizes as the primary obstruction to real progress. When we see that sectors overdue for disruption, such as medicine, education, and construction, all happen to be sectors with high government involvement, the logical progression leads us to question why government itself cannot deliver basic services at costs comparable to private-sector equivalents. As just one example out of hundreds, in California and other high-tax states, annual license plate registration can cost $400/year. What does the taxpayer truly receive? The ability to trace license plates to driver's licenses and insurance. Why should such a simple system cost so much? It seems that it should cost only $2/year at 2018 technological levels. By contrast, note how much value you receive from a $96/year Netflix subscription.
By all accounts, many basic government services could easily implement cost reductions of 98-99%. While the subject of government inefficiency vs. the ATOM is perhaps the primary topic of this website and the ATOM publication, one small example from across the world demonstrates what a modernized government looks like. The tiny country of Estonia contains just 1.3 million people. A desire to catch up after decades of being part of the Soviet Union perhaps spurred it to modernize and digitize government services to a degree that Americans would scarcely believe could exist. Here are some articles about Estonia's successful digitization, where you can read the specific details: The New Yorker, The Atlantic, Fortune.

Estonia has also taken early steps towards certain ATOM realities. While it does have high consumption taxes, income tax is a flat 21%, thereby saving immense costs in complexity and processing (which cost the US over $700 Billion/yr). If only it figures out the ATOM principles around the monetization of technological deflation, it could reduce income taxes to zero. Now, for the unfortunate part. When a country manages to produce a product or service that the rest of the world wants and cannot produce at the same quality and price itself, the first country can export the product to the outside world. From Taiwanese chipsets to South Korean smartphones and television sets to Italian cheeses, the extension of sales to exports is straightforward. Yet in the governance sector, despite government being a third of the world economy, Estonia has no market in which it can sell its services to hasten the digitization of other governments. Whether at the Federal, State, City, or County level, the United States has hundreds of governments that could simply hire Estonian consultants and implementation staff to rapidly install new services. This could be lucrative enough to make Estonia a very wealthy country, and would then attract competition from other countries (such as nearby Finland, which is attempting to follow Estonia's path). Yet, unlike with a private-sector product or service, governance just does not value efficiency or productivity to this extent. The State of California alone could save billions of dollars per year, and either spend the taxes on other things, or (preferably) pass the savings on to the taxpayers. Before long, the ATOM will force even the largest nation-states to improve the productivity of their government services. But that process will be messy, and government officials may take a scorched-earth approach to defending their own rice bowls. Let us hope that Estonia inspires at least a few other countries into voluntary modernization.

Related ATOM Chapters: 10. Implementation of the ATOM Age for Nations

With the new year, we have a new ATOM AotM. This is an award for a trend that ought to be easy to recognize for anyone at all familiar with Moore's Law-type concepts, yet is greatly overlooked despite quite literally being in front of people's faces for hours a day. The most crude and uninformed arguments against accelerating technological progress are either of a "word processing is no better than in 1993, so Moore's Law no longer matters" or a "people can't eat computers, so the progress in their efficiency is useless" nature. However, the improvements in semiconductor and similar technologies endlessly find their way into previously low-tech products, which is the most inherent ATOM principle.
The concept of television has altered cultures across the world more than almost any other technology. The range of secondary and tertiary economies created around it is vast. The 1960 set pictured here, at $795, cost 26% of US annual per capita GDP at the time. The equivalent price today would be $15,000. Content was received over the air, and was often subject to poor reception. The weight and volume of the device relative to the area of the screen was high, and the floorspace consumed was substantial. There were three network channels in the US (while most other countries had no broadcasts at all). There was no remote control. There were slow, incremental improvements in resolution and screen-size-to-unit-weight ratios from the 1960s until around 2003, when one of the first thin television sets became available at the retail level. It featured a 42-inch screen, was only 4 inches thick, and cost $8000. Such a wall-mountable display, despite the high price, was a substantial improvement over the cathode ray tube sets of the time, most of which were too large and heavy to be moved by one person, and consumed a substantial amount of floor space. But in true ATOM exemplification, this minimally-improving technology suddenly got pulled into rapid, exponential improvement (part of how deflationary technology increased from 0.5% of World GDP in 1999 to 1% in 2008 to 2% in 2017). Once the flat-screen TV was on the market, plasma and LCD displays eventually gave way to LED displays, which are a form of semiconductor and improve at Moore's Law rates. Today, even 60-inch sets, a size considered extravagant in 2005, are very inexpensive. Like any other old electronic device, slightly out-of-date sets are available on Craigslist in abundance (contributing to the Upgrade Paradox). A functional used set that cost $8000 in 2003 can hardly be sold at all in 2018; the owner is lucky if someone is willing to come and take it for free. Once ATOM-speed improvements assimilate a technology, the improvements never stop, and sets of the near future may be thin enough to be flexible, with resolutions of 4K, 8K, and beyond. Sets larger than 240 inches (20 feet) are similarly declining in price and visible in increasing numbers in commercial use (i.e. Times Square, everywhere). This is hence one of the most visible examples of ATOM disruption, and of how the cities of today have altered their appearance relative to the recent past. This is a large ATOM disruption, as there are still 225 Million new sets sold each year, amounting to $105 Billion/year in sales.

Related: The Impact of Computing
Related ATOM Chapters: 3. Technological Disruption is Pervasive and Deepening

I have been selected to teach a class at Stanford Continuing Studies, titled "The New Economics of Technological Disruption". For Bay Area residents, it would be great to see you there. There are no assignments or exams for those who are not seeking a letter grade, and by Stanford standards, the price ($525 for an 8-week class) is quite a bargain. 35 (now 44) students have already signed up. See the course description, dates, and more. Inch by inch, the ATOM is reaching more people.

I was invited back to Reference Point a second time to discuss the ATOM. I was also back on FutureTalk a second time to discuss blockchain and cryptocurrencies. Remember that older media content for the ATOM is here.

For this month, the ATOM AotM goes outward. Much like the September ATOM AotM, this is another dimension of imaging.
But this time, we focus on the final frontier. Few have noticed that the rate of improvement in astronomical discovery is now on an ATOM-worthy trajectory, such that it merited an entire chapter in the ATOM publication. Here at The Futurist, we have been examining telescopic progress for over a decade. In September of 2006, I estimated that telescope power was rising at a compounding rate of 26%/year, and that this trend had been ongoing for decades. 26%/year happens to be the square root of Moore's Law, which is precisely what is to be expected, since doubling resolution by halving the size of a pixel requires dividing one pixel into four. This is also why video game and CGI resolution rises at 26%/year. Rising telescope resolution enabled the first exoplanet to be discovered in 1995, and then a steady stream after 2005. This estimated rate led me to correctly predict that the first Earth-like planets would be discovered by 2010-11, and that happened right on schedule. But as with many such thresholds, after the initial fanfare, the new status quo sets in and people forget what life was like before. This leads to a continuous underestimation of the rate of change by the average person.

Then, in May 2009, I published one of the most important articles ever written on The Futurist: SETI and the Singularity. At that time, only 347 exoplanets were known, almost all of which were gas giants much larger than the Earth. That number has grown to 3693 today, or over ten times as many. Note how we see the familiar exponential curve inherent to every aspect of the ATOM. Now, even finding Earth-like planets in the life zone is no longer remarkable, which is another aspect of human psychology towards the ATOM - a highly anticipated and wondrous advance quickly becomes a normalized status quo, and most people forget all the previous excitement. The rate of discovery may soon accelerate further as key process components collapse in cost. Recent computer vision algorithms have proven themselves to be millions of times faster than human examiners. A large part of the cost of exoplanet discovery instruments like the Kepler Space Observatory is the 12-18 month manual analysis period. If computer vision can perform this task in seconds, the cost of comparable future projects plummets, and new exoplanets are confirmed almost immediately rather than every other year. This is another massive ATOM productivity jump that removes a major bottleneck in an existing process structure. A new mission like Kepler would cost dramatically less than the previous one, and would be able to publish results far more rapidly. Given the 26%/year trendline, the future of telescopic discovery becomes easier to predict.

In the same article, I made a dramatic prediction about SETI and the prospects of finding extraterrestrial intelligence. Many enlightened people are certain that there are numerous extraterrestrial civilizations. While I too believed this for years (from age 6 to about 35), as I studied the accelerating rate of change, I began to notice that within the context of the Drake equation, any civilization even slightly more advanced than us would be dramatically more advanced. For such a civilization, while its current activities might very well be indistinguishable from nature to us, its past activities might still be visible as evidence of its existence at that time.
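As a numerical aside, both rates mentioned above can be checked in a couple of lines (the 26%/year estimate and the 347 and 3693 exoplanet counts are this site's figures):

# 26%/yr in linear resolution implies ~59%/yr in pixel count -- a doubling roughly
# every 18 months, i.e. the classic Moore's Law pace -- since doubling resolution
# requires four times as many pixels.
print(f"(1.26)^2 - 1 = {1.26**2 - 1:.0%} per year")

# Implied growth rate of known exoplanets between 2009 (347) and 2017 (3693).
growth = (3693 / 347) ** (1 / 8) - 1
print(f"Known exoplanets grew at roughly {growth:.0%} per year over that span")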
That line of reasoning led me to realize that while there could very well be thousands of planets in our own galaxy hosting civilizations slightly less advanced than us, it becomes increasingly difficult for there to be one more advanced than us that still manages to avoid detection. Other galaxies are a different story, simply because the distance between galaxies is itself 10-20 times greater than the diameter of the typical galaxy. Our telescopic capacity is rising at 26%/year, after all, and the final variable of the Drake equation, fL, has risen from just 42 years at the time of Carl Sagan's famous clip in 1980, to 79 years now, or almost twice as long. Hence, the proclamation I had set in 2009 about the 2030 deadline (21 years away at the time) can be re-affirmed, as the 2030 deadline is now only 13 years away. Despite the enormity of our galaxy and the wide range of signals that may exist, even this is eventually superseded by exponential detection capabilities. At least our half of the galaxy will have received a substantial examination of signal traces by 2030. While a deadline 13 years away seems near, remember that the extent of examination that happens in 2017-30 will be more than in all the 400+ years since Galileo, for Moore's Law reasons alone. The jury is out until then. (All images from Wikipedia or Wikimedia.)

Related Articles: New Telescopes to Reveal Untold Wonders; SETI and the Singularity; Telescope Power - Yet Another Accelerating Technology
Related ATOM Chapters: 12. The ATOM's Effect on the Final Frontier

For this month, the ATOM AotM goes to an area we have not visited yet. Enterprise software and associated hardware technologies may appear boring at first, but there is currently a disruption in this area that is generating huge productivity gains. Amazon Web Services (AWS) is an ever-growing list of services that replaces computing, storage, and networking expenditures at client companies. At present, over 90 different services are available. Here is a slideshow of the various companies and sectors being disrupted by AWS. Cloud computing itself is relatively new, but this revolution by Amazon has taken direct slices out of the existing businesses of Microsoft, IBM, and Oracle, which were slow to deploy cloud-based solutions since they wanted to extend the lives of their existing product lines. Their anti-technology behavior deserves to be punished by the ATOM, and Amazon obliged. AWS is set to register $14 Billion in revenue for 2017, most of which has replaced a greater sum of revenue at competing companies. The biggest value is the lower cost of entry for smaller companies, via the on-demand flexibility enabled by AWS. Now that IT Security and Compliance are far more cost-effective through AWS, the barrier to entry for smaller firms is lowered. This is particularly useful for clients in far-flung locations, enabling a decentralization that facilitates greater technological progress. Upgrades across computing, storage, software, networking, and security are disseminated seamlessly, and since far less hardware is used, the upgrade process is far more materially efficient. This removes a variety of smaller bottlenecks to technological progress, mitigating the corporate equivalent of the Upgrade Paradox. Another great benefit is elasticity: a company no longer has to estimate its future hardware capacity needs, which can often lead to overbuying of rapidly deflating technologies, or underbuying, which can cause customer dissatisfaction due to slow speeds.
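A toy illustration of that elasticity point: fixed hardware has to be sized for the peak hour, while on-demand capacity tracks the actual load. All numbers below are made-up placeholders, not AWS prices:

# Illustrative only: fixed provisioning is sized for peak demand; elastic
# provisioning rents capacity hour by hour (at a premium) as demand moves.
hourly_demand = [20] * 18 + [90] * 6          # servers needed: quiet day, busy evening
peak = max(hourly_demand)

fixed_cost = peak * 24 * 1.00                 # own enough for the peak, $1/server-hour
elastic_cost = sum(h * 1.25 for h in hourly_demand)   # pay only for what is used

print(f"Fixed provisioning per day:   ${fixed_cost:,.0f}")
print(f"Elastic provisioning per day: ${elastic_cost:,.0f}")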
All of this can now be scaled dynamically through AWS. For the productivity gains inherent to the scale and dynamism of AWS, it receives the October 2017 ATOM AotM.

Related ATOM Chapters: 3. Technological Disruption is Pervasive and Deepening

For September 2017, the ATOM AotM takes a very visual turn. With some aspects of the ATOM, seeing is believing. Before photography, the only image capture was through sketches and paintings. This was time-consuming, and well under 1% of people were prosperous enough to have even a single hand-painted portrait of themselves. For most people, after they died, their families had only memories via which to imagine their faces. If portraits were this scarce, other images were even scarcer. When image capture was this scarce, people certainly had no chance of seeing places, things, or creatures from far away. It was impossible to know much about the broader world. The very first photograph was taken as far back as 1826, and black-and-white was the dominant form of the medium for over 135 years. That it took so long for black-and-white to transition to color may seem quite surprising, but the virtually non-existent ATOM during this period is consistent with this glacial rate of progress. The high cost of cameras meant that the number of photographs taken in the first 100 years of photography (1826-1926) was still extremely small. Eventually, the progression to color film seemed, in the minds of most people, to complete the technology's evolution. What more could happen after that?

But the ATOM was just getting started, and it caught up with photography around the turn of the century with relatively little fanfare, even though it was notable that film-based photography and its associated hassles were removed from the consumer experience. The cost of film was suddenly zero, as were the transit time and cost of the development center. Now, everyone could have thousands of photos, and send them over email endlessly. Yet standalone cameras still cost $200 as of 2003, and were too large to be carried around everywhere at all times. As the ATOM progressed, digital cameras got smaller and cheaper, even as resolution continued to rise. It was discovered that the human eye does in fact adapt to higher resolution, and finds previously acceptable lower resolutions unacceptable afterward. Technology hence forces higher visual acuity and the associated growth of the brain's visual cortex. With the rise of the cellular phone, the ATOM enabled more and more formerly discrete devices to be assimilated into the phone, and the camera was one of the earliest and most obvious candidates. The diffusion of this was very rapid, as we can see from the image that contrasts the 2005 vs. 2013 Papal inaugurations in Vatican City. Before long, the cost of an integrated camera trended towards zero, to the extent that there is no mobile device that does not have one. As a result, 2 billion people have digital cameras with them at all times, and stand ready to photograph just about anything they think is important. Suddenly, there are countless cameras at every scene. But lest you think the ubiquity of digital cameras is the end of the story, you are making the same mistake as those who thought color photography on film in 1968 was the end of the road. Remember that the ATOM is never truly done, even after the cost of a technology approaches zero.
Digital imaging itself is just the preview, for it now generates an ever-expanding pile of an even more valuable raw material: data. Images contain a large volume of data, particularly the data that associates things with each other (the eyes are to be above the nose, for example). Data is one of the two fuels of Artificial Intelligence (the other being inexpensive parallel processing). Despite over a decade of digital images being available on the Internet, only now are there enough of them for AI to draw extensive conclusions from them, and for Google's image search to be a major force in the refinement of Google's Search AI. Most people don't even remember when Google added image search to its capabilities, but now it is hard to imagine life without it. Today, we have immediate access to image search that answers questions in the blink of an eye, and fosters even greater curiosity. In a matter of seconds, you can look up images of mandrill teeth, the rings of Saturn, a transit of Venus across the Sun, the coast of Capri, or the jaws of Carcharocles Megalodon. More searches lead to more precise recommendations, and more images continue to be added. In the past, the accessibility of this information was so limited that the invaluable tangents of curiosity just never formed. Hence, the creation of new knowledge speeds up. The curious can more easily pull ahead of the incurious. Digital imaging is one of the primary transformations that built the Internet age, and is a core pillar of the impending ascent of AI. For this reason, it receives the September 2017 ATOM AotM.

Related ATOM Chapters: 3. Technological Disruption is Pervasive and Deepening

For the August 2017 ATOM AotM, we will bend one of the rules. The rule is that a disruption already has to have begun, and be presently underway. But this time, a conversation in last month's comments brought forth a vision of a quad-layer disruption that is already in its early stages and will manifest in no more than 15 years' time. When fully underway, this disruption will further tighten the screws on government bodies that are far too sclerotic to adapt to the speed of the ATOM. To start, we will list the progression of each of the four disruptions separately.

1) Batteries are improving quickly, even though electric vehicles are not yet competitive in terms of cost and charging speeds (partly because the true cost of imported oil is not directly visible to consumers). At the same time, an electric car has far fewer moving parts, and fewer liquids to deal with. By many estimates, an electric car can last 300,000 miles before significant deterioration occurs, vs. 150,000 miles for an internal combustion engine car. Now, under the current ownership model, a car is driven only 12,000 miles/year and is parked 90% of the time or more. The second half of an electric vehicle's lifetime (150,001-300,000 miles) would only begin in year 13 and extend until year 25 of ownership, which is not practical. If only there were a way to avoid having the car remain idle 90% of the time, occupying parking spaces. It may take until 2032 for electric cars to compress the cost delta to the point of being superior to ICE cars in total ownership costs for the early years, which then leads to the dividend available in the later years of the electric car's life.

2) Autonomous vehicles are a very overhyped technology. Stanford University demonstrated an early prototype in 2007.
Yet even a decade later, a fully autonomous car that operates without any human involvement, let alone the benefit of a network of such cars, seems scarcely any closer. Eventually, by about 2032, cars will be fully autonomous, widely adopted, and in communication with each other, greatly increasing driving efficiency through higher speeds and far shorter following distances than humans can manage. Uber-like services will cost 60-80% less than they do now, since the earnings of the human driver are no longer an element of cost, and Uber charges just 20-30% of the fare itself. It will be cheaper for almost everyone to take the on-demand service all the time than to own a car outright or even take the bus. If such a car is driven 20 hours a day, it can in fact accrue 300,000 miles in just 5 years of use. This is effectively the only way that electric cars can be driven all the way up to the 300,000-mile limit.

3) The displacement of brick-and-mortar retail by e-commerce has far greater implications for the US than for any other country, given the excessive amount of land devoted to retail stores and their parking lots. The most grotesque example of this is in Silicon Valley itself (and to a lesser extent, Los Angeles), where vast retail strip-mall parking lots are largely empty, yet are within walking distance of tall, narrow townhouses that cost $1.5M despite taking up footprints of barely 600 sqft each. As the closure of more retail stores progresses, and on-demand car usage reduces the need for so many parking spaces, these vast tracts of land can be diverted to another purpose. In major California metros, the economically and morally sound strategy would be to convert the land into multi-story buildings, preferentially residential. But extreme regulatory hurdles and resistance to the construction of any new housing supply will leave this land as dead capital for as long as the obstructionists can manage. But in the vast open suburbs of the American interior, land is about to go from plentiful to extremely plentiful. If you think yards in the suburbs of interior cities are large, wait until most of their nearby strip malls are available for redevelopment, and the only two choices are either residential land or office buildings (there are more than enough parks and golf courses in those locations already). Areas where population is already flat or declining will have little choice but to build even more, and hope that ultra-low real estate costs can attract businesses (this will be no problem at all if the ATOM-DUES program is implemented by then). This disruption is not nearly as much of a factor in any country other than the US and, to a lesser extent, Australia, as other countries did not misallocate so much land to retail (and the associated parking lots) in the first place.

4) This fourth disruption is not as essential to this future as the first three, but is highly desirable, simply due to how overdue it is. It is quite shocking that the least productive industry in the private sector relative to 50 years ago is not education, not medicine, but construction. US construction productivity has fallen over the last 50 years. Not merely failed to rise, mind you, but outright declined in absolute terms. But remember, under ATOM principles, the more overdue a disruption is, and the more artificial the obstructions thwarting it, the more sudden it is when it eventually happens.
China is not held back by the factors that have led to the abysmally low productivity of US construction, and when there is so much retail land to repurpose, the pressure to revive that dead capital will just become too great, even if that means Chinese construction companies have to come in to provide what their US counterparts cannot. This pressure could be the catalyst of the long-overdue construction productivity catch-up. This topic warrants a lengthy article of its own, but that is for another day. Hence, the first three factors, and possibly the fourth, combine by 2032 to generate a disruption so comprehensive in the US that the inability of government to change zoning laws and permitting at anything close to the speed of market demand will be greatly exposed. The first disruption, batteries, alone could be an ATOM AotM, but this time, the cumulative disruption from these multiple factors, even if it takes the next 15 years to accomplish, gets the award.

Related: The End of Petrotyranny (and Victory); Why I Want(ed) Oil to Hit $120 per Barrel; A Future Timeline for Automobiles; A Future Timeline for Energy; Why $70/Barrel Oil is (was) Good for America
Related ATOM Chapters: 3. Technological Disruption is Pervasive and Deepening; 11. Implementation of the ATOM Age for Individuals

The ATOM AotM for July 2017 reminds us of the true core principles of the ATOM. Whenever anything becomes too expensive relative to the value provided, particularly if done so through artificial government intervention in markets, a technological solution invariably replaces the expensive incumbent. Taxi medallions, particularly in New York City, are just the crudest form of city government corruption. Drunk with its own greed, the city ratcheted up the price of taxi medallions from $200,000 in 2003 to $1M in 2013, which is far faster than even the S&P 500, let alone inflation. Note how there was no decline at all during the 2008-09 recession. This predatory extraction from consumers, much like high oil prices artificially engineered by OPEC, created a market window that might otherwise not have existed until several years later. This induced the ATOM to address the imbalance sooner than it otherwise might have, and gritty entrepreneurs swiftly founded companies like Uber and Lyft, which provided dramatically better value for money. As a result, the price of taxi medallions in NYC fell by 80% from the inflated peak. The ATOM was at a sufficiently advanced level for the technological response to be as rapid as it was (unlike with, say, expensive oil in the 1973-81 period, when there was almost no ATOM of macroeconomic significance). Remember that the reduction in cost for a given ride, and the demolition of a seemingly intractable government graft obstacle, is just the first of several ATOM effects. The second is the security of each driver and passenger being identified before the ride. The third is the volume of data that these millions of rides generate (data being one of the two core fuels of Artificial Intelligence). The fourth is the ability to dynamically adjust to demand spikes (the surge pricing that the economically illiterate malign). The fifth is the possibility of new service capabilities altogether. Recall this excerpt from Chapter 11 of the ATOM: Automobile commuters with good jobs but lengthy commutes have joined Uber-type platforms to take a rider along with them on the commute they have to undertake anyway.
The driver earns an extra $200-$400/week (against which an appropriate portion of car and smartphone costs can be applied as deductions) with no incremental input of time or cost. Meanwhile, other commuters enjoy having one less car on the road for each such dynamically generated carpooling pair. The key is that a dead commute is now monetized even by corporate-class people, increasing passengers per car and reducing traffic congestion, while replacing dedicated taxicabs. For the macroeconomy, it also creates new VM where none existed before. The creation of an entirely new sub-economy, with entirely new velocity of money (VM), is where new real wealth creation is at its purest. This effect of these ride-sharing platforms is still in its infancy. When autonomous vehicles replace human drivers, the loss of driver income is matched (indeed exceeded, in post-tax terms) by savings to passengers. It does not matter which company ultimately wins (Uber is having some PR problems lately), but rather that the disruption is already irreversible and woven into the fabric of the ATOM and broader society. Maybe Uber and Lyft will just be to transportation services what Data General and Commodore were to computing. The point is, this is a superb example of how the ATOM works, and of how the transformation is often multi-layered.

I have recently appeared on a couple of television programs. The first was Reference Point with Dave Kocharhook, as a two-part Q&A about The ATOM. The next one was FutureTalk TV with Martin Wasserman, which included a 10-minute Q&A about The ATOM. Inch by inch, we will get there. The world does not have to settle for our current substandard status quo. As always, all media coverage is available here.

For May 2017, the award goes in a direction that not many associate with technological disruption. Remember that the ATOM relates not merely to products that themselves have a rapidly improving cost/benefit profile, but also to technological improvements in products, processes, and services that themselves may not be high-tech. The standard shipping container is just an inert box, and most people rarely ever see one. It is not improving from one year to the next in any meaningful sense. The real innovation was in the process technologies enabled through this standardization, and the immense deflation derived through those technologies. Malcolm McLean, a trucking tycoon, envisioned the idea of standardized container sizes, and generously decided to give his idea away for free rather than patent it and seek profit. After the first experiment was a success, rapid adoption and port standardization followed. As we can see from the table, the introduction of the shipping container swiftly led to an almost 20-fold increase in unloading rates from 1965 to 1970, an unusually rapid improvement in any productivity metric for such an early era. This increased speed led to larger ships, and this in turn led to larger and fewer ports. From an ATOM perspective, these productivity gains introduced a great deal of deflation into the prices of the goods themselves. A broader range of goods could be traded internationally, leading to many more countries being able to compete for the same export demand. New countries could merely join existing supply chains, rather than build entire industries from scratch.
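Annualizing that unloading-rate jump underlines how unusual it was; a short sketch using the 20-fold and five-year figures from the table cited above:

# The 20-fold improvement in unloading rates over 1965-70, expressed per year.
annual = 20 ** (1 / 5) - 1
print(f"Implied improvement: roughly {annual:.0%} per year, for five consecutive years")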
China's entry into international trade could not have been as rapid as it was without the shipping container, and the advantages it conferred onto large countries over smaller ones, and onto low-cost production countries over expensive ones. This advantage is ongoing, as countries poorer than China are still in the process of capturing the low-hanging fruit of benefits that the shipping container provides. Despite this introduction having begun almost 50 years ago, the full ATOM effect continues to increase. The precise logistics of the entire container-shipping ecosystem demands ever more powerful computation, sensors, and other innovations like RFID tags and GPS tracking. Furthermore, supply chains transporting trillions of dollars of goods each year generate a huge amount of data, which for the longest time was not even being utilized. Any large and ever-growing collection of data will attract Artificial Intelligence onto it, and this AI will generate additional productivity gains for participants in the supply chain, and hence price reductions for end-users. Since shipping containers are produced in such volume, there are ideas emerging to use them elsewhere, such as building blocks for modular construction, or as simple pop-in swimming pool enclosures. For this reason, the shipping container, an inert metal box that transformed the entire world economy, receives the May 2017 ATOM AotM. H/T: Geoman

It is time for the ATOM AotM for April 2017, for which we return to an article I wrote way back in 2009. That article is titled The Publishing Disruption, and at the time of writing, we were on the brink of a transformation in content publication as seismic as the invention of the Gutenberg printing press. Since that time, the anticipated sequence of events unfolded as expected. To excerpt from that article, consider how many centuries of background evolution occurred to get us to where we were in 2007:

What a unique thing a book is. Made from a tree, it has a hundred or more flexible pages that contain written text, enabling the book to contain a large sum of information in a very small volume. Before paper, clay tablets, sheepskin parchment, and papyrus were all used to store information with far less efficiency. Paper itself was once so rare and valuable that the Emperor of China had guards stationed around his paper possessions. Before the invention of the printing press, books were written by hand, and few outside of monasteries knew how to read. There were only a few thousand books in all of Europe in the 14th century. Charlemagne himself took great effort to learn how to read, but never managed to learn how to write, which still put him ahead of most kings of the time, who were generally illiterate. But with the invention of the printing press by Johannes Gutenberg in the mid-15th century, it became possible to make multiple copies of the same book, and before long, the number of books in Europe increased from thousands to millions.

But then, note how incredibly low-tech and low-productivity the traditional publishing industry still was well into the 21st century: Fast forward to the early 21st century, and books are still printed by the millions. Longtime readers of The Futurist know that I initially had written a book (2001-02), and sought to have it published the old-fashioned way. However, the publishing industry, and literary agents, were astonishingly low-tech.
They did not use email, and required queries to be submitted via regular mail, with a self-addressed, stamped envelope included. So I had to pay postage in both directions, and wait several days for a round trip to hear their response. And this was just the literary agents. The actual publishing house, if it decided to accept your book, would still take 12 months to produce and distribute the book even after the manuscript was complete. Even then, royalties would be 10-15% of the retail price. This prospect did not seem compelling to me, and I chose to parse my book into this blog you see before you.

The refusal by the publishing industry to use email and other productivity-enhancing technologies as recently as 2003 kept their wages low. Editors always moaned that they worked 60 hours a week just to make $50,000 a year, the same as they made in 1970. My answer to them is that they have no basis to expect wage increases without increasing their productivity through technology. An industry this far behind was just begging to be disrupted. As we have seen from the ATOM, the more overdue a particular disruption is, the more dramatic and swift the disruption when it eventually occurs, as the distance to travel just to revert to the trendline of that particular innovation is great. Proceeding further in the original article:

The Amazon Kindle launched in late 2007 at the high price of $400. Many people feel that the appeal of holding a physical book in our hands cannot be replaced by a display screen, and take a cavalier attitude towards dismissing e-readers. The tune changes upon learning that the price of a book on an e-reader is just a third of what the paper form at a brick-and-mortar bookstore, with sales tax, would cost.

As of 2017, an entry-level Kindle 8 costs just $80 (with 3 GB of storage), yet is far more advanced than the $400 Kindle of 2007 (with just 250 MB of storage). Cumulative Kindle sales are estimated to be over 100 million units now. But the Kindle hardware is not the real disruption, as it is a new purchase imposed on people who needed no such device to read paper books. The real ATOM disruption is in books themselves. Now, an author can publish directly on Kindle, and at a $10 sales price, immediately begins to receive a 70% royalty. Contrast that with the 10-15% royalty on a $20 sales price in traditional book publishing, and that too only after a 12-month waiting period once the manuscript is complete. While bound books may still make sense for famous authors, the new market created by the Kindle has enabled the publication of many books that only expect to sell 10,000 copies. There is no material involved, so the production and distribution cost of any such publication has literally fallen by a factor of millions. A hefty cost is now no cost, precisely as the ATOM predicts. 2017 is the year in which e-book sales have surpassed print and audio book sales, as per the chart. Since the previous article, brick-and-mortar bookstores have seen a torrent of closures. Borders has completely shut down all of its 511 bookstores in the US. Barnes & Noble still exists, partly due to capturing the residual Borders revenue, but a growing share of B&N's in-store revenue is now from the coffee shop, magazines, and certain specialty book sales.
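To make the per-copy economics described above concrete, here is a minimal sketch of the arithmetic (a rough illustration only; the 12.5% traditional royalty is simply the midpoint of the 10-15% range quoted above):

    # Per-copy author earnings: Kindle direct publishing vs. traditional publishing
    kindle_price, kindle_royalty_rate = 10.00, 0.70
    paper_price, paper_royalty_rate = 20.00, 0.125   # midpoint of the 10-15% range

    kindle_per_copy = kindle_price * kindle_royalty_rate   # $7.00 per copy
    paper_per_copy = paper_price * paper_royalty_rate      # $2.50 per copy

    copies = 10_000   # the niche-book volume mentioned above
    print(f"Kindle: ${kindle_per_copy * copies:,.0f} to the author on {copies:,} copies")
    print(f"Traditional: ${paper_per_copy * copies:,.0f} to the author on {copies:,} copies")

Roughly $7 versus $2.50 per copy, before even counting the year of lead time saved.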
The unshackling of the bottom 99% of authors and aspiring authors from the extreme inefficiency of the traditional publishing industry has unleashed more content than was ever possible before, and is a market upgrade just as significant as that of the Gutenberg press in the 15th century. It is also a perfect demonstration of the accelerating rate of change, for while it took centuries for the diffusion of printed books to manifest, the e-book transformation took mere years. For this reason, the Amazon Kindle and e-book ecosystem are the winner of April 2017's ATOM Award of the Month. I need more candidate submissions for future ATOM AotM awards.

Related ATOM Chapters:
3. Technological Disruption is Pervasive and Deepening
4. The Overlooked Economics of Technology

There is an emerging paradox within the flow of technological diffusion. The paradox is, ironically, that the rapid progress of technology has constrained its own ability to progress further. What exactly is the meaning of this? As we see from Chapter 3 of the ATOM, all technological products currently amount to about 2% of GDP. The speed of diffusion is ever faster (see chart), and the average household is taking on an ever-widening range of rapidly advancing products and services. Refer to the section from that chapter about the number of technologically deflating nodes in the average US household by decade (easily verified by viewing any TV program from that decade), and a poll for readers to declare their own quantity of nodes. To revisit the same thing here:

Include: Actively used PCs, LED TVs and monitors, smartphones, tablets, game consoles, VR headsets, digital picture frames, LED light bulbs, home networking devices, laser printers, webcams, DVRs, Kindles, robotic toys, and every external storage device. Count each car as 1 node, even though modern cars may have $4000 of electronics in them.

Exclude: Old tube TVs, film cameras, individual software programs and video games, films on storage discs, any miscellaneous item valued at less than $5, or your washer/dryer/oven/clock radio just for having a digital display, as the product is not improving dramatically each year.

1970s and earlier: 0
1980s: 1-2
1990s: 2-4
2000s: 5-10
2010s: 12-30
2020s: 50-100
2030s: Hundreds?

Herein lies the problem for the average household. The cost to upgrade PCs, smartphones, networking equipment, TVs, storage, and in some cases the entire car, has become expensive. This can often run over $2000/year, and unsurprisingly, upgrades have been slowing. The technology industry is hence a victim of its own success. By releasing products that cause so much deflation, and hence low Nominal GDP growth and sluggish job growth, the technology industry has been constricting its own demand base. Amidst all the job loss through technological automation, the hiring of the tech industry itself is constrained if fewer people can keep buying its products. If the bottom 70-80% of US household income brackets can no longer keep up with technological upgrades, their ability to keep up with new economic opportunities will suffer as well. This is why monetization of technological progress into a dividend is crucial, which is where the ATOM Direct Universal Exponential Stipend (DUES) fits in. It is so much more than a mere basic income, since it is directly indexed to the exact speed of technological progress.
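To see how quickly the node counts above translate into an annual upgrade bill, here is a rough back-of-the-envelope sketch; the per-node replacement cost and upgrade cycle are purely illustrative assumptions, not figures from the ATOM:

    # Hypothetical illustration: annual upgrade cost as household node counts grow by decade
    avg_cost_per_node = 300    # assumed replacement cost per node (illustrative)
    upgrade_cycle_years = 3    # assumed upgrade cycle (illustrative)
    nodes_by_decade = {"2000s": 8, "2010s": 20, "2020s": 75}   # midpoints of the ranges above

    for decade, nodes in nodes_by_decade.items():
        annual_cost = nodes * avg_cost_per_node / upgrade_cycle_years
        print(f"{decade}: ~{nodes} nodes -> roughly ${annual_cost:,.0f}/year to stay current")

With these assumed inputs, the 2010s midpoint lands right around the $2000/year figure cited above, and the 2020s node counts push the bill far higher.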
As of April 2017, the estimated DUES amount in the US is $500/month (up from $400/month back in February 2016, when the ATOM was first published). A good portion of this cushion enables faster technology upgrades and more new adoption.

It is that time of the month again. For our third-ever ATOM AotM, we return to an article I wrote over 10 years ago about the lighting revolution. At that time, when the disruption was still in the future, I highlighted how the humble status of the light fixture leads to an associated disruption going widely unnoticed. That continues to be true even today, despite the important product transition that most people have already undertaken. So the ATOM AotM award for March 2017 goes to the LED lightbulb. Something that most people do not even notice is a major engine of the ATOM, as it has introduced rapid price/performance improvements into what used to be a stagnant product.

Charge of the Light Brigade: Remember that the average household has about 25 light bulbs. From the chart, we can see that light output per unit of energy and cost per watt of LED lighting are both improving rapidly, leading to a double-exponential effect. Lighting is hence now fully in the path of the ATOM and is seeing progress at a rate entirely beyond what the predecessor technology could have experienced, and is indeed one of the fastest technology shifts ever (see the second chart). Bulbs are now purchased in packs of 4-12 units, rather than the single-unit purchases of the recent past. The expected electricity savings worldwide are estimated to be over $100 Billion per year in the near future. The domino effects of this are immense. Follow the sequence of steps below: LED bulbs are reducing the electricity consumed by lighting. This reduction in demand more than accommodates the proliferation of electric cars. The first 100 million electric cars worldwide (a level we are still extremely far from) will merely offset the loss of electricity demand for lighting. The spread of electric cars with no net rise in electricity consumption nonetheless reduces oil consumption and hence oil imports. The US already has a trade surplus with OPEC, for the first time in half a century, and this force is strengthening further. Even if the price per barrel of oil had not fallen through fracking, the number of imported barrels still would have plunged. So even though most lighting is not fueled by oil, it created a puncture point through which a second-degree blow to oil demand arose. That is truly amazing, making LED lighting not just a component of the ATOM but one of the largest disruptions currently underway. That concludes this month's ATOM AotM. I need more reader submissions to ensure we have a good award each month.

Related:
3. Technological Disruption is Pervasive and Deepening
The Imminent Revolution in Lighting, and Why it is More Important Than You Think

When people think of FinTech, they think of a few things like peer-to-peer lending, payment companies, asset management firms, or maybe even cryptocurrencies. But one of the most outdated yet burdensome costs in all of finance, spread across the widest range of people, is still overlooked. The mortgage lending process is heavily padded with fees that are remnants of a bygone age. Enter the ATOM. First, we must begin with the effect of technology on short-term interest rates.
The Fed Funds rate was close to zero for several years, and it is apparent that any brief increase in rates by the Federal Reserve will swiftly be reversed once markets punish the move in subsequent months. We are in an age of accelerating and exponential technological deflation, and not only will the Fed Funds rate have to be zero forever, but money-printing will be needed to offset deflation. This process has already been underway for years, and is not yet recognized as part of the long-term trend of technological progress.

A 30-year mortgage was the standard format for decades, with a variable-rate mortgage seen as risky after a borrower locks in a low rate on their 30-year mortgage. But when the Fed Funds rate was at nearly zero, the LIBOR (London Interbank Offered Rate) hovered around 0.18% or so. If you get a variable-rate mortgage, the rate is calculated off of the LIBOR, with an additional premium levied by the lending institution. This premium is about 1.5% or more. When the LIBOR rate was over 3% not too many years ago, the lender premium was only a third of the mortgage rate, but now it is 85-90% of the mortgage rate. So instead of paying 0.18%, the borrower pays 1.7%. This huge buffer represents one of the most attractive areas for FinTech to disrupt, as what was once a secondary cost is now the overwhelmingly dominant padding, itself a remnant of a bygone age.

When almost 90% of the interest charged in a mortgage merely represents the value that the lending institution provides, we can examine the components of this and see which of those could be replaced with a lower-cost technological alternative. The lender, such as a major bank, provides a brand name, a mortgage officer to meet with face-to-face, and other such provisions. All of this is either unnecessary, or can be provided at much lower cost with the latest technologies. For example, blockchains can ensure the security aspects of the mortgage transaction are robust. Online consumer review services can provide an extra layer of reputational buttressing to any innovative new lending platform. The rationale for such a hefty mortgage markup over the underlying interest rate is just no longer there. If the lender premium in a mortgage falls from 85-90% of the rate down to, say, 50%, then the rate on an adjustable-rate mortgage will decline to just twice the LIBOR, or about 0.4%. Even though the Federal Reserve has recently increased the Fed Funds rate, this is very temporary, and 0% will be the Fed Funds rate for the majority of the foreseeable future, just as it has been for the last 9 years.

When this sort of ATOM-derived cost savings on interest payments percolates through the economy, it will cause a series of disruptions that will greatly reduce one of the last main consumer expenditures not yet being attacked by technology. Housing costs have risen above the inflation rate in many major cities, against the grain of technology. This is unnatural, since a home does not spontaneously renovate itself, get bigger, or otherwise increase in inherent value. On the contrary, the materials deteriorate over time, so the value should fall. Yet home prices rise despite these structural forces, due to artificial decisions to restrict supply, lower bond yields through QE, etc. This artificial propping up of home prices masks the excessive costs in the industry, particularly in the mortgage-lending sector.
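Returning to the rate decomposition above, here is a minimal sketch of the arithmetic (the function name is just illustrative, and the figures are the ones quoted in the text):

    # Decompose an adjustable-rate mortgage rate into benchmark (LIBOR) plus lender premium,
    # and show what share of the borrower's rate the premium represents.
    def arm_rate(libor, lender_premium):
        return libor + lender_premium   # all values in percentage points

    premium = 1.5   # the ~1.5% lender premium quoted above
    for libor in (3.00, 0.18):   # the two LIBOR scenarios discussed above
        rate = arm_rate(libor, premium)
        print(f"LIBOR {libor:.2f}% -> ARM rate {rate:.2f}%, premium is {premium / rate:.0%} of the rate")

With LIBOR at 3%, the premium is about a third of the rate; at 0.18%, it is nearly 90%, which is exactly the padding described above.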
As FinTech irons out the aforementioned outdated expenses in the mortgage-lending process, many fundamental assumptions about home ownership will change. Home ownership is a very emotional concept for many buyers (which is why there is a widespread misconception that a person owns their home even while they are making mortgage payments on it, when in reality, ownership is achieved only when the mortgage is fully paid off). This emotion obscures the high costs of obsolete products and procedures that continue to reside in the mortgage industry. Amidst all the technological disruptions we have seen within the last generation, most people still don't understand that the central origin of most disruptions is an outdated, expensive incumbent system. But the FinTech wing of the ATOM has started the cracks-in-the-dam process against a very substantial and widely-levied cost, and this may be the disruption that brings FinTech's dividends to the masses.

After the inaugural award in January, a new month brings a new ATOM AotM. This time, we go to an entirely different sector than we examined last time. The award for this month goes to the collaboration between the Georgia Institute of Technology, Udacity, and AT&T to provide a fully accredited Master of Science in Computer Science degree, for the very low price of $6,700 on average. The disruption in education is a topic I have written about at length. In essence, most education is just a transmission of commoditized information that, like every other information technology, should be declining in cost. However, the corrupt education industry has managed to burrow deep into the emotions of its customers, to such an extent that a rising price for a product of stagnant (often declining) quality is not even questioned. For this reason, education is in a bubble that is already in the process of deflating. What the MSCS at GATech accomplishes is four-fold:

Lowering the cost of the degree by almost an order of magnitude compared to the same degree at similarly-ranked schools
Making the degree available without relocation to where the institution is physically located
Scaling the degree to an eventual intake of 10,000 students, vs. just 300 that can attend a traditional in-residence program at GATech
Establishing best practices for other departments at GATech, and other institutions, to implement in order to create a broader array of MOOC degree programs

After a slow start, enrollment is now reported to be over 3,300 students, representing a significant fraction of students presently studying MS-level computer science at equal or higher-ranked schools. The only reason enrollment has not risen all the way up to the full 10,000 is insufficient resourcefulness among prospective students in shopping around and applying ATOM principles to greatly increase their own living standards. Aside from perhaps the top two schools, MIT and Stanford, there is perhaps no greater value for money than the GATech MSCS, which will become apparent as the slower adopters drift towards the program, particularly from overseas. Eventually, the sheer size of enrollment will rapidly lead to GATech becoming a dominant alumni community within computer science, forcing other institutions to catch up. When this competition lowers costs even further, we will see one of the most highly paid and future-proof professions become accessible at little or no cost.
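For a sense of the scale of the existing cost gap, here is a quick back-of-the-envelope comparison (the $50,000 figure for a traditional in-residence MSCS is an assumption for illustration, not a number from the post):

    # Rough comparison between the online GATech MSCS and a traditional in-residence program
    omscs_cost = 6_700          # average total cost quoted above
    residential_cost = 50_000   # assumed tuition for a comparable in-residence MSCS (illustrative)

    print(f"Cost ratio: roughly {residential_cost / omscs_cost:.0f}x cheaper online")
    print(f"Capacity: 10,000 online seats vs. 300 in residence, about {10_000 // 300}x more")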
When contrasted to the immense costs of attending medical or law school, many borderline students will pursue computer science ahead of professions with large student debt burdens, creating a self-reinforcing cycle of ever more computer science and ATOM propagation. The fact that one can enroll in the program from overseas will attract many students from countries that do not even have schools of GATech's caliber (i.e. most countries), generating local talent despite remote education. Crucially, this is strong evidence of how the ATOM always finds new ways to expand itself, since the field most essential to the feeding of the ATOM, computer science, is the one that found a way to greatly increase the number of people destined to work in it, by attacking both cost thresholds and enrollment volumes. This is not a coincidence, because the ATOM always finds a way around anything that is inhibiting its growth, in this case access to computer science training. Subsequent to this, the ATOM can increase the productivity of education even in less ATOM-crucial fields such as medicine, law, business, and K-12, since the greatly expanded size of the computer science profession will provide the entrepreneurs and expertise to make this happen. This is how the ATOM captures an ever-growing share of the economy into rapidly-deflating technological fundamentals. As always, the ATOM AotM succeeds through reader suggestions, so feel free to suggest candidates. Criteria include the size and scope of the disruption, how anti-technology the disrupted incumbent was, and an obvious improvement in the quality of a large number of lives through this disruption.

Related:
The Education Disruption: 2015
11. Implementation of the ATOM Age for Individuals

With the new year, we are starting a new article series here at The Futurist. The theme will be a recognition of exceptional innovation. Candidates can be any industry, corporation, or individual that has created an innovation exemplifying the very best of technological disruption. The more ATOM principles exhibited in an innovation (rising living standards, deflation acting in proportion to prior inflation in the incumbent industry, rapid continuous technological improvement, etc.), the greater the chance of qualification.

The inaugural winner of the ATOM Award of the Month is the US hydraulic fracturing industry. While fracking garnered the most news in 2011-13, the rapid technological improvements have continued. Natural gas continues to hover around just $3, making the US one of the most competitive countries in industries where natural gas is a large input. Oil prices continue to fall due to ever-improving efficiencies, and from the chart, we can see how many of the largest fields have seen breakevens fall from $80 to under $40 in just the brief 2013-16 period. This is of profound importance, because now even $50 is a profitable price for US shale oil. There is no indication that this trend of lower breakeven prices has stopped. Keep in mind that the massive shale formations in California are not even being accessed yet due to radical obstruction, but a breakeven of $30 or lower ensures the pressure to extract this profit from the Monterey Shale continues to rise. Beyond that, Canada has not yet begun fracking of its own, and when it does, it will certainly have at least as much additional oil as the US found.
This increase, which is just an extra 3M barrels/day added to US supply, was nonetheless enough to capsize this highly inelastic market and crash world oil prices from $100+ to about $50. Given the improving breakevens, and the possibility of new production, this will continue to pressure oil prices for the foreseeable future. This has led to the US turning the tables on OPEC and reversing a large trade deficit into what is now a surplus. If you had told any of those peak-oil Malthusians that the US would soon have a trade surplus with OPEC, they would have branded you as a lunatic. Note how that ill-informed Maoist-Malthusian cult utterly vanished. Furthermore, this plunge in oil prices has strengthened the economies of other countries that import most of their oil, from Japan to India.

Under ATOM principles, technology always finds a way to lower the cost of something that has become artificially expensive and is hence obstructing the advancement of other technologies. Oil was a premier example of this, as almost all technological innovation is done in countries that have to import large portions of their oil, while almost none is done by oil exporters. Excess wealth accumulation by oil exporters was an anti-technology impediment, and demanded the attention of a good portion of the ATOM. Remember that the worldwide ATOM is of an ever-rising size, and comprises the sum total of all technological products in production at a given time (currently, about 2% of world GDP). Hence, all technological disruptions are interconnected, and when the ATOM is freed up from the completion of a certain disruption, that amount of disruptive capacity becomes available to tackle something new. Given the size of this disruption to oil prices and production geography, this occupied a large portion of the ATOM for a few years, which means a lot of ATOM capacity is now free to act elsewhere.

This disruption was also one of the most famous predictions of mine here at The Futurist. In 2011, I predicted that high oil prices were effectively a form of burning a candle at both ends, and that such prices were jolting at least six compensating technologies into overdrive. I provided an equation predicting when oil would topple, and it toppled well in accordance with that prediction (even sooner than the equation estimated). This concludes our very first ATOM AotM to kick off the new year. I need candidate submissions from readers in order to get a good pool to select from. Criteria include the size and scope of the disruption, how anti-technology the disrupted incumbent was, and an obvious improvement in the quality of a large number of lives through this disruption.

I came across some recent charts about the growth of two unrelated sectors, one disrupting manufacturing, the other disrupting software of all types (click to enlarge). On one hand, each chart commits the common error of portraying smooth parabolic growth, with no range of outcomes in the event of a recession (which will surely happen well within the 8-year timelines portrayed, most likely as soon as 2017). On the other hand, these charts provide reason to be excited about the speed of progress seen in these two highly disruptive technologies, which are core pillars of the ATOM. This sort of growth rate across two quite unrelated sectors, while present in many prior disruptions, is often not noticed by most people, including those working in these particular fields.
Remember, until recently it took decades or even centuries to have disruptions of this scale, but now we see the same magnitude of transformation happen in mere years, and in many pockets of the economy. This supports the case that all technological disruptions are interconnected and that the aggregate size of all disruptions can be calculated, which is a core tenet of the ATOM.

Related:
3. Technological Disruption is Pervasive and Deepening

I have recently come into contact with a few professionals in transition, many from the now-shrinking big semiconductor companies. In speaking to them, one thing that stood out is how it takes them 9-12 months or more to secure a new position. Why is this the case, in an age of accelerating technological progress, as per the ATOM? This is an instance where culture has prevented the adoption of a solution that is technologically feasible.

Where Cultural Inertia Obstructs Technology: Before the Internet age, if you wanted to research a subject, you had to go to the library, spend hours there, check out some books, and go back home. Overall, this consumed half a day, and could only be conducted during the library's hours of operation. If the books did not have all the information you needed, you had to repeat this process. Even this was available only in the dozen or so countries that have good public libraries in the first place. But now, in the Internet age, the same research can be conducted in mere minutes, from any location. The precision of Google and other search engines continues to improve, and with deep learning, many improvements are self-propagating. There is a 10x to 30x increase in the productivity of searching for information. If you feel that this example is imprecise, take the case of LinkedIn. It has enabled many aspects of career research and networking that were just not possible before. If a young person wishes to explore dozens of career paths and estimate common patterns, the utility of a certain degree, or the probability of reaching a certain title, LinkedIn has an endless supply of information and people you can identify and communicate with. Yet despite all of this, job searches are just as lengthy as in the days before the Internet, LinkedIn, and other resources. If a candidate can match with three potential jobs in their search region at any given time, then the connection between employer and candidate should take mere weeks, not close to a year. There is no other widespread transaction within society that takes anywhere near as long. Despite new apps to organize the job search and new social media outlets that announce endless meetups and networking events, technology has clearly failed to generate any productivity gains in this process.

For one thing, the Internet has reduced the marginal cost of an application to so little that each position receives hundreds of candidates, unlike the three or four back when paper resumes had to be sent via the US Postal Service. To cope with this, employers use software that searches resumes for keywords. This method selects for certain types of resumes, with keyword optimization superseding more descriptive elements of the resume, and filters out many suitable candidates in favor of those who know how to game the keyword algorithm. From this point, a desire to mitigate hiring risk, combined with the lack of imagination inherent to most corporations, defaults into a practice of increasing the number of interviewers that the candidate faces.
Three rounds and a dozen interviews are not uncommon, but by most accounts, job interviews are nearly useless as predictors of performance. In reality, a candidate only needs to be interviewed by three people: the hiring manager, the manager above that, and one lateral peer. If these three people cannot make an accurate assessment, adding several other interviewers is not going to add additional value. Indeed, if the boss's boss cannot make an accurate assessment of candidates, then they are failing at the primary skill that an executive is supposed to have. Reference checks are also a peculiar ritual, as a candidate will only submit favorably disposed references who have been contacted beforehand.

Modernizing Hiring for the Information Age: Matching openings with candidates should not be so tedious in this age of search engines, emailed resumes, and LinkedIn. Resistance to change and a miscalculation of risk and opportunity cost are the human obstacles standing athwart favorable evolution. To correct this obsolete situation, consider the mismanagement that occurs at the source. Only after a hiring manager sees a persistent and pronounced need for additional personnel does the process of getting a requisition approved and advertised commence. Hence, the job begins to receive resumes only several months after the need for a new hire arose. After that point, the lengthy selection and interviewing process takes months more. Instead, what if the data analytics of a corporate setting could be gathered, mined, and processed, so that an AI identifies a cluster of gaps within the existing team, and identifies suitable candidates from LinkedIn? Candidates with the correct skillset could be identified with a compatibility score such as 86% fit, 92% fit, and so on. The entire process, from the starting point where a team begins to find itself understaffed to when a candidate deemed an acceptable fit is hired, can compress from over a year to mere weeks. The hefty fees charged by recruiters vanish, and the shorter duration of unemployment reduces all the indirect costs of extended unemployment.

For this level of dynamic assessment of gaps and subsequent candidate mapping, the capability of search and data analytics within a corporation has to evolve to a far more advanced state than presently exists. Emails, performance reviews, project schedules, etc. all have to be searchable across the same search and patterning capabilities. Then, this has to interface with LinkedIn, which itself has to become far more advanced, with the capability for a candidate to continuously re-verify skills and prove certain competencies (through tests, certified courses, etc.). The platform has had no real improvement in capabilities in the last few years, and the obvious next step - generating a complex set of skill parameters for LinkedIn members, and matching that pattern to employers with similar needs - is quite overdue. If this seems like added work for candidates, remember that this effort is far less than the amount of time and hassle it will save in the job search process. Of course, such a capability across LinkedIn and some pattern-matching machine learning engine will not be adopted overnight. After all, corporations still think university degrees and school rank are good indicators of candidate job performance, despite both evidence and common sense. After that, the interface between some internal corporate software and LinkedIn will take a lot of work to become robust.
Finally, the belief that a greater number of interviews somehow reduces the risk of hiring a candidate is a belief that will be difficult to purge. But eventually, with technology companies leading the way, the massive hidden cost of current hiring practices may come to light, and give way to a system that uses AI to find more precise matches with much greater speed.

Conclusion: We now possess the machine learning capabilities to dynamically detect gaps within corporate teams and organizational structures that may be large enough to warrant an increase in headcount. These gaps can be matched with parameters mined from LinkedIn profiles, providing candidates with an assessment of their approximate fit. A percentage score calculated for each candidate is not only a more accurate indicator than the very imprecise interview process, but is far quicker as well. It is high time that these tools were created by LinkedIn and others, and that corporate culture shifted towards their adoption. This application of AI is the second most necessary technological disruption that AI can deliver to our civilization at present. For the first, check back for the next article. I do not have the time to pursue a company built around this type of machine learning product, but if someone else is inspired to take up this challenge, I would certainly like to be on your board of directors.

Related ATOM Chapters:
11. Implementation of the ATOM Age for Individuals

The recent FOMC meetings continue to feature a range of debate only around the rate at which the Fed Funds rate can be increased up to about 4% (a level which has not coincided with a robust economy since the late 1990s). They actually describe this as a normal rate, and the process of raising the rate as normalization. The Dot Plot pictured here indicates the paradigm that the Federal Reserve still believes. Even the most dovish members still think that the Fed Funds rate will be above 2% by 2019. This is dangerously inaccurate. At the start of 2016, the Federal Reserve expected that it would do four rate hikes this year alone. Now it is down to an expectation of just two (one more than the one earlier this year), and may just halt at one. How can a collection of supposedly the best and wisest economic forecasters be so consistently wrong? A 20% stock market correction will lead to a swift rate reversal, and a 25%+ correction will lead to a resumption of QE in excess of $100B/month.

As we can see in the ATOM e-book, technological deflation is endless and exponentially increasing, and hence the Wu-Xia shadow rate indicates the natural Fed Funds rate for the US to be around the equivalent of -2%. Yes, minus two percent, achieved through the various rounds of QE that have been done to date in order to simulate a negative interest rate. The US stopped its QE in 2014, but continues to be held afloat by a portion of the $220B/month of worldwide central bank easing that flows into the US. This is barely enough to keep US Nominal GDP (NGDP) growth at 3%, which is far below the level at which innovation can proceed at its trendline rate. The connection between technological progress, technological deflation, and worldwide central bank action has still not been grasped by decision-makers. The -2% indicated by the Wu-Xia shadow rate might be as deep as -4% by 2025, under current trends of technological diffusion. The worldwide central bank easing required to halt deflation by that time will be several times higher than today's.
As per the ATOM policy reform recommendations, this can be an exceptionally favorable thing if the fundamentals are recognized. For the full analysis and thesis, read the ATOM e-book.

Related ATOM Chapters:
4. The Overlooked Economics of Technology
6. Current Government Policy Will Soon Be Ineffective
7. Government Policies Must Adapt, and Quickly
10. Implementation of the ATOM Age for Nations

In the ATOM e-book, we examine how technological disruption can be measured, and how the aggregate disruption ongoing in the world at any given time continues along a smooth, exponentially rising trendline. Among these, certain disruptions are invisible to most onlookers, because a tangential technology is simultaneously disrupting seemingly unrelated industries from an orthogonal direction. In that vein, here are two separate lists of industries that are being disrupted, one by Deep Learning and the other by Blockchain:

13 Industries Using Deep Learning to Innovate
20 Industries that Blockchain Could Disrupt

Note how many industries are present in both of the above lists, meaning that those sectors have to deal with compound disruptions from more than one direction. In addition, we see that sectors where disruption was artificially thwarted due to excessive regulation and government protectionism merely see a sharper disruption, higher up in the edifice. When the disruption arrives through abstract technologies such as Deep Learning and Blockchain, the incumbents are unlikely to be able to thwart it, due to the source of the disruption being effectively invisible to the untrained eye. What is understood by very few is that the accelerating rate of adoption/diffusion, depicted in this chart from Blackrock, is enabled by such orthogonal forces that are not tied to any one product category or even industry.

Related ATOM Chapters:
Technological Disruption is Pervasive and Deepening
The Overlooked Economics of Technology

A number of new telescopes will soon enter service, all of which are far more powerful than their equivalent predecessors. This is fully expected by any longtime reader of The Futurist, for space-related articles have been a favorite theme here. To begin, refer to the vintage 2006 article where I estimated telescope power to be rising at a compound annual rate of approximately 26%/year, although that is the trendline of a staircase with very large steps. This, coincidentally, is exactly the same rate at which computer graphics technology advances, which also happens to be the square root of Moore's Law's rate of progress. According to this timeline, a wave of powerful telescopes arriving now happens to be right on schedule. Secondly, refer to one of the very best articles on The Futurist, titled SETI and the Singularity, where the impact of increasing telescopic power is examined. The exponential increase in the detection of exoplanets (chart from Wikipedia), and the implications for the Drake Equation, are measured, with a major prediction about extraterrestrial life contained therein. Building on that, in the ATOM e-book, I detail how accelerating technological progress has a major impact on space exploration. Contrary to a widely-repeated belief that space exploration has plateaued since the Apollo program, technology has ensured that quite the opposite is true. Exoplanet detection is now in the hundreds per year (and soon to be in the thousands), even as technologies such as 3D printing in space and asteroid mining are poised to generate great wealth here on Earth.
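As an aside, here is a quick check on the 26%/year arithmetic mentioned above (a rough illustration only; the 18-month doubling time for the underlying Moore's Law trend is an assumption, since the article does not pin one down):

    # If compute doubles every 18 months, the annual growth factor is 2**(1/1.5) ~ 1.59.
    # A quantity growing as the square root of that factor grows ~26%/year, doubling every ~3 years.
    import math

    moore_factor = 2 ** (1 / 1.5)          # ~1.59x per year, assuming an 18-month doubling time
    sqrt_rate = math.sqrt(moore_factor)    # ~1.26x per year
    doubling_years = math.log(2) / math.log(sqrt_rate)
    print(f"~{(sqrt_rate - 1) * 100:.0f}%/year, doubling roughly every {doubling_years:.1f} years")

Under that assumption, the square-root trend works out to about 26%/year, doubling roughly every three years.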
With space innovation no longer exclusively the domain of the US, costs have been lowered through competition. India has launched a successful Mars orbiter at 1/10th the cost of equivalent US or Russian programs, and it has been in operation for two years.

Related ATOM Chapters:
3. Technological Disruption is Pervasive and Deepening
12. The ATOM's Effect on the Final Frontier

The best news of the last month was something that most people entirely missed. Amidst all the distractions and noise that comprise modern media, a quiet press release disclosed that a supercomputer has suddenly become more effective than human doctors in diagnosing certain types of ailments: IBM's Watson correctly diagnoses a patient after doctors are stumped. This is exceptionally important. As previously detailed in Chapter 3 of The ATOM, not only was a machine more competent than an entire group of physicians, but the machine continues to improve as more patients use it, which in turn makes it more attractive to use, which enables the accrual of even more data upon which to improve further. But most importantly, a supercomputer like Watson can treat patients in hundreds of locations in the same day via a network connection, and without appointments that have to be made weeks in advance. Hence, such a machine replaces not one, but hundreds of doctors. Furthermore, it takes very little time to produce more Watsons, but it takes 30+ years to produce a doctor from birth, among the small fraction of humans with the intellectual ability to even become a physician. The economies of scale relative to the present doctor-patient model are simply astonishing, and there is no reason that 60-80% of diagnostic work done by physicians cannot soon be replaced by artificial intelligence. This does not mean that physicians will start facing mass unemployment, but rather that the best among them will be able to focus on more challenging problems. The most business-minded of physicians can incorporate AI into their practice to see a greater volume of patients with more complicated ailments. This is yet another manifestation of various ATOM principles, from technologies endlessly crushing the cost of anything overpriced, to the self-reinforcing improvement of deep learning.

Related: Eight paraplegics take their first step in years, thanks to robotics.

Related ATOM Chapters:
3. Technological Disruption is Pervasive and Deepening
4. The Overlooked Economics of Technology

The US 10-Year Treasury yield is now just 1.55% and dropping. Despite US QE (which has not been at a full $85B/month since 2013), this is evidence of future expectations of very little inflation. My research for the ATOM has revealed that pervasive technological disruption is the reason for this structurally declining inflation. Failure to recognize that this technological force is permanent and rising exponentially is the reason experts are baffled as to where all the central bank money is going, and why reforms towards a revamp of central bank monetary easing are not being discussed. Remember that this fall in yields is seen across all countries with significant technology density. German and Japanese 10-year bond yields are almost 0%. US real estate received a decades-long boost from falling mortgage rates as long-term bond yields fell. Few question how homes that used to be 2.5 times the household income of the area are now priced at 10 times the household income or higher.
With yields getting this low, and with property taxes now as large a contributor to home ownership costs as the mortgage payment, how much higher can home prices go? How much of US GDP is dependent on not merely high, but rising, home prices?

Related ATOM Chapters:
3. Technological Disruption is Pervasive and Deepening
4. The Overlooked Economics of Technology
6. Current Government Policy Will Soon Be Ineffective

MIT Technology Review has an article describing how Tesla Motors has brought rapid disruption to the previously staid auto industry, where there are otherwise many factors precluding the entry of new companies. But this is nothing new for readers of The Futurist, as I specifically identified Tesla as a key candidate for disruption way back in 2006. In Venture Capital terms, this was an exceptionally good pick at such an early stage. In ATOM terms, the progress of Tesla is an example of everything from how all technological disruptions are interlinked, to how each disruption is deflationary in nature. It is not just about the early progress towards electric cars, the removal of the dealership layer of distribution, or the recent erratic progress of semi-autonomous driving. Among other things, Tesla has introduced lower-key but huge innovations such as remote wireless software upgrades of the customer fleet, which is itself a paradigm shift towards rapidly-iterating product improvement. In true ATOM form, the accelerating rate of technological change is beginning to sweep the automobile along with it. When Tesla eventually manages to release a sub-$35,000 vehicle, the precedents set in dealership displacement, continual wireless upgrades, and semi-autonomous driving will suddenly all be available across hundreds of thousands of cars, surprising unprepared observers but proceeding precisely along the expected ATOM trajectory.

Chapter 2 of the ATOM e-book addresses the centuries-old accelerating trendline of economic growth. Recall that this was the topic of an article of mine almost exactly 9 years ago as well. However, there may be more nuances to this concept than previously addressed. It may be that since GDP is a human construct, it only happens to be correlated to the accelerating rate of change by virtue of humans being at the forefront of advancing intelligence. It could be that once artificial intelligence can advance without human assistance, most types of technology that improve human living standards may stagnate, since the grand goal of propagating AI into space would no longer be bottlenecked by human progress. Humans are certainly not the final state of evolution, as evidenced by the much greater suitability of AI for space exploration (AI does not require air or water, etc.). That is certainly something to think about. Human progress may only be on an accelerating curve until a handoff to AI is completed. After that, metrics quite different from GDP may be best for measuring progress, as the AI perhaps only cares about computational density, TERAFLOPS, etc.

I am unveiling The ATOM, a 14-chapter e-book that contains novel concepts, research, and policy prescriptions about the various effects of technological progress on the economy and society. You can go over to the e-book and start reading and commenting there. Blog posts on The Futurist will now be related to ATOM-affiliated concepts. The e-book is published in blog format, so that comments can accrue underneath each chapter, and future blog posts can link to specific parts of the e-book.
Videos at the start of each chapter serve as summaries for those who do not wish to read a wall of text in order to get a synopsis. Go over there and read it, chapter by chapter, up and down. You will never see the world quite the same way again.

It is the end of an era here at The Futurist. As long-time readers know, this has been a blog of two individual bloggers who did not distinguish themselves from each other. This was a worthwhile experiment at the inception of the website in early 2006. But now, the goals have changed. The technology blogger will be the primary blogger here, taking a slightly different direction for this site. The political blogger has, for the most part, retired from blogging, and has discontinued participation in some of his other online communities (such as anti-misandry). The technology blogger has been working on a related project of much more comprehensive scope, and will be linking it here to The Futurist. Hence, this blog will be primarily devoted to technological and economic topics, with very little political content going forward. Both bloggers will write under their real names. From Imran Khan (the political blogger): Many of my fans from the anti-misandry sphere have wondered why I drifted away from there in 2014. Well, it was a combination of several factors:

1) My Work There Was Done: The predictions in The Misandry Bubble were solidified and part of the DNA of the androsphere. Existing bloggers keep up with current events and parse the news through the anti-misandry filter, but my contribution was just comments after The Misandry Bubble, over 6 years ago. Over time, much of the content in the androsphere trends toward repetition. When 2020 arrives, we will do an assessment scorecard of the predictions made a decade prior.

2) Not Enough Activism: Anti-misandry ideas have spread to the mainstream through the effort of some great bloggers (like Dalrock and PM/AFT). But too many of the other participants do far too little beyond Internet commenting. I mentioned this in The Misandry Bubble under Why There is no Men's Rights Movement, and this continues to be true. I even invented a strategy and campaign uniquely tailored to operate within the constraints of anonymity, cost, and decentralization that were needed for any real Men's Rights Activism, but only half a dozen men did the legwork despite everyone hailing it as highly effective. To this day, there is minimal activism beyond about five key people, while far smaller causes immediately manage to get the apparatus of activism established.

3) Too Much Infighting: The blogs I commented at do an admirable job of attracting and keeping civil commenters. But elsewhere, some major figure in the androsphere is in an acrimonious battle with another almost every month. The reasons are usually just poor communication between the two parties. For a community of just 300 or so active participants that is up against an evil that outnumbers and outspends them by a ratio of several million to one, there is too much wasteful infighting among people who agree on 90% of their views. Such a movement makes little real movement.

4) Too Much Anger Towards Average Women: As The Misandry Bubble states, the hierarchy of misandric zeal is Hardcore Feminist > Mangina > Average Woman. The average woman does not seethe with a desire to harm any and all males the way a full-time feminist does. The average woman just wants to side with whoever is winning, which is an evolutionary mechanism that helped women survive.
I always maintained that ordinary women were being harmed by feminism just as much as ordinary men, since what average women value most has been taken away from them by feminists. As I have maintained, it is impossible to harm one gender without harming the other. Some parts of the sphere have too much anger towards average women, and too little towards the sleazy men who think groveling to feminists will improve their social status. These manginas are universally hated by normal men, normal women, and even hardcore feminists, yet do most of the heavy lifting that keeps the hate-cult going. If there is an Achilles heel that can be attacked to bring the edifice down, it is these manginas. The inability of Men's Rights to focus on the weakest target ensures a lack of progress. Speaking of manginas...

5) Too Many Neo-Nazis: The androsphere has been infested by a strain of Neo-Nazis (describing themselves as white nationalists) who are both racial supremacists and economic leftists. Their views are wrong on both of those counts, but that is not even the worst thing about them. They are antithetical to the notion of Men's Rights, since they believe that a woman of their race is far more valuable than a man, to the extent of being a goddess. It is apparent that any ethno-centric ideology will default to an obsession with fertility rates, and since women are the scarcer reproductive resource, such ideologies invariably become nothing more than fertility goddess cults. This is true, of course, of any ethno-centric ideology, not just whites. In fact, it is a testament to white maturity that white nationalists never get any real traction among their own population. This is precisely why whites are successful - they do a better job marginalizing their own degenerates than other groups do. My debates with the Neo-Nazis were funny. I would routinely point out that 90% of American whites are just not racist, and they would counter that they indeed are, contrary to my observation. In other words, I would insist that their group is not racist, which they see as a bad thing (as it explains their poor recruitment), leading them to insist that more whites are. And I am not just brown, but a Muslim too. Remember that white nationalism recruits only the least successful white men, almost entirely due to their desire to obstruct a white woman's choice to date outside her race. This is leftist protectionism demanded by uncompetitive actors, nothing more. The coup de grace I apply in such debates is to point out that there is almost zero female participation in white nationalism, despite it being an ideology wholly dependent on white women having more babies. The hilarity of an ideology built around higher reproduction nonetheless finding itself to be 98-99% male is self-evident. Women have a natural radar that steers them away from loserdom, and manginas (whether general or Neo-Nazi) always create this effect. Since the ideologies of Neo-Nazis and feminists have substantial overlap, Men's Rights cannot advance without a purge of these Neo-Nazis. Over time, this purge will happen, but my time is better spent elsewhere.

6) The Futurist has a Different Destiny: My technology co-blogger has created something of grand purpose, something so profound that it has a higher significance. It is valuable enough that this website should be devoted exclusively to it, without tangential distractions, given that he is more of a political moderate than I.
For these reasons, my participation in the androsphere has drawn to a close. The Misandry Bubble will remain where it is, but it should be seen as a time capsule of predictions, to be opened 10 years hence from original publication.

A decade ago, in the early days of this blog, we had an article tracking video game graphics at 10-year intervals. As per that cadence, it is time to add the next entry to the progression. The polygons in any graphical engine increase as the square root of Moore's Law, so the number of polygons doubles every three years. Sometimes, pictures are worth thousands of words:

1976: (screenshot)
1986: (screenshot)
1996: (screenshot)
2006: (screenshot)

I distinctly remember when the 2006 image looked particularly impressive. But now, it no longer does. This inevitably brings us to...

2016: (an entire video is available, with some gameplay footage)

This series illustrates how progress, while not visible over one or two years, accumulates to much more over longer periods of time. Now, extrapolating this trajectory of exponential progress, what will games bring us in 2020? Or 2026? Additionally, note that screen sizes, screen resolution, and immersion (e.g. VR goggles) have risen simultaneously.

I refer readers back to an article written here in 2011, titled The End of Petrotyranny, where I claimed that high oil prices were rapidly burning through the buffer that was shielding oil from technological disruption. I quantified the buffer in an equation, and even provided a point value for how much of the buffer was still remaining at the time. I am happy to declare a precise victory for this prediction, with oil prices having fallen by two-thirds and remaining there for well over a year. While hydraulic fracturing (fracking) turned out to be the primary technology to bring down the OPEC fortress, other technologies such as photovoltaics, batteries, and nanomaterials contributed secondary pressure to the disruption. The disruption unfolded in accordance with the 2011 Law of Finite Petrotyranny: From the start of 2011, measure the dollar-years of area enclosed by a chart of the price of oil above $70. There are only 200 such dollar-years remaining for the current world petro-order. We can call this the Law of Finite Petrotyranny. Go to the original article to see various scenarios of how the dollar-years could have been depleted. While we have not used up the full 200 dollar-years to date, the range of scenarios is now much tighter, particularly since fracking in the US continues to lower its breakeven threshold. At present, over $2T/year that was flowing from oil importers to oil producers has now vanished, to the immense benefit of oil importers, which are the nations that conduct virtually all technological innovation. The 2011 article was not the first time this subject of technological pressure rising in proportion to the degree of oil price excess has been addressed here at The Futurist. There were prior articles in 2007, as well as 2006 (twice). As production feverishly scales back, and some of the less central petrostates implode, oil prices will gradually rise back up, generally saturating at the $70 level (itself predicted in 2006) in order to deplete the remaining dollar-years. But we may never again see oil at such a high price relative to world GDP as existed for most of 2007-14 (oil would have to be $200+/barrel today to surpass the record of $147 set in 2008, in proportion to World GDP).
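For readers who want to see how the dollar-years bookkeeping works, here is a minimal sketch (the annual average prices below are illustrative placeholders, not actual market data; substitute real annual averages to reproduce the published scenarios):

    # Dollar-years above $70, per the Law of Finite Petrotyranny: for each year since 2011,
    # accumulate max(average oil price - 70, 0); the budget is exhausted at 200 dollar-years.
    THRESHOLD = 70.0   # dollars per barrel
    BUDGET = 200.0     # total dollar-years allowed under the law

    # Placeholder annual average prices (illustrative only)
    avg_price_by_year = {2011: 95, 2012: 94, 2013: 98, 2014: 93, 2015: 49, 2016: 43}

    consumed = sum(max(price - THRESHOLD, 0) for price in avg_price_by_year.values())
    print(f"Dollar-years consumed: {consumed:.0f} of {BUDGET:.0f}; remaining: {BUDGET - consumed:.0f}")

With these placeholder prices, roughly half the budget would have been consumed before prices fell below the $70 threshold, which is consistent with the statement above that the full 200 dollar-years have not yet been used up.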
