In this article the author argues for dumb (i.e., not smart, not IoT) appliances. One problem is that smart appliances are complex and prone to failure. For example: “[t]he last time a repairman came to my house, he told me that he’d had to get a full-time weekday job at Home Depot because nowadays, when appliances break, most people just buy new ones.” He also coined Owen’s Test: “Until I can operate my Samsung TV, Blu-Ray player, Amazon Fire TV Stick, and Cisco cable box with a single remote control, the Internet of Things is a hoax.”
Harvard Business School professor: Half of American colleges will be bankrupt in 10 to 15 years
The HBS professor is Clayton Christensen, whom I hold in high regard. One of his contributions is understanding how technology disrupts businesses. The disruptive technology affecting colleges is online education. Basically, he believes “online education will become a more cost-effective way for students to receive an education, effectively undermining the business models of traditional institutions and running them out of business.” See this article.
Re:scam
Read something very enjoyable today. Netsafe built a bot that wastes scammers’ time. Hurrah! Email scamming is a billion-dollar business that preys on people’s naivety (and, to some extent, their greed).
Scammers send out millions of emails. Most are deleted or directed to spam. But a few (enough to keep the scammers in business) hit. Scammers personally follow up on these hits, trying to get people to reveal sensitive information. Re:scam is a bot that will engage scammers in an endless conversation. Any time the scammers spend with the bot is time they are not preying on your grandmother.
I love it.
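Re:scam’s internals aren’t described here, but the idea is simple enough to sketch. The Python snippet below is entirely my own illustration (the replies and function names are made up, and a real deployment would have to read from and send to an actual mailbox); the core trick is that every reply is polite, open-ended, and ends with a question, so the conversation never reaches a conclusion.

```python
# A minimal sketch of the Re:scam idea (my own illustration, not Netsafe's
# actual code): keep sending plausible, open-ended replies so the scammer
# spends time talking to a bot instead of a real victim.
import random

STALLING_REPLIES = [
    "This sounds wonderful! Could you explain the fees one more time?",
    "My bank asked for a reference number. Which one should I give them?",
    "The payment form rejected my address. Can you resend the instructions?",
    "Can you confirm your office hours? I only do banking on Tuesdays.",
]

def next_reply(scammer_message: str) -> str:
    """Pick a canned reply that invites yet another response from the scammer."""
    opener = f'Thank you for your message ("{scammer_message[:40]}...").'
    body = random.choice(STALLING_REPLIES)
    return f"Dear friend,\n\n{opener}\n{body}\n\nYours sincerely,\nA very interested customer"

# Simulate a short exchange; a real deployment would be wired to a mailbox.
if __name__ == "__main__":
    scam = "You have been selected to receive USD 4,500,000"
    for turn in range(3):
        print(f"--- bot reply #{turn + 1} ---")
        print(next_reply(scam))
        print()
```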
What To Do When Machines Do Everything
Paul Roehrig gave a thought-provoking keynote address at the HPCC summit last month (video below). He also wrote a book with the same title as this post (I stole it). The high-level takeaway is that AI will (eventually) benefit all. But, similar to the industrial revolution, the disruption caused by AI will hurt many industries along the way. (There is a chapter called “There will be blood.”) His talk begins at 8:40 and ends at 29:40 in the video below.
Building Software
Kent Beck is credited with saying “make it work; make it right; make it fast.” This is a complete software design strategy that encompasses many well-understood tenets.
Make it work.
In any but the simplest programs, you don’t really understand what you need or how to provide it until you are finished. Therefore, it doesn’t make sense to (try to) build the “perfect” solution out of the gate. In The Cathedral and the Bazaar, Eric Raymond told us to plan to throw one implementation away, because you will anyway. This is an adaptation of a similar observation by Fred Brooks in The Mythical Man-Month.
This seems like bad advice, and it doesn’t make sense in most industries. For example, we don’t build and throw away a house before building the one we want. But in software it does apply, for two reasons. First, software is cheap. There is a labor cost, but there is no material cost, unlike building a home or an airplane.
The second reason we throw the first away is that software is ether. A house is confined to three dimensions and physical materials. There are only so many ways to use a 2×4, and it will always behave within given specifications. But there are almost no restrictions on programming abstractions. The only way to discover the best abstraction is to build it and take it for a drive.
Make it right.
Only after discovering the proper abstractions can you build the right solution. Waiting until a working solution has been developed helps you avoid premature generalization. It is very tempting to build complex objects that solve a myriad of problems. However, unless those problems actually arise in the solution, the generalization just adds overhead and complexity and increases the cost of maintaining the solution.
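A toy example (mine, not from the original post) makes the cost concrete. The “generalized” version below anticipates pluggable tax strategies and currency conversion that no use case has asked for; the “make it work” version covers the actual requirement in one function.

```python
# A toy illustration of premature generalization (my example, not the post's).
# Task: compute the total for an order including a flat sales tax.

# Premature generalization: layers of abstraction nothing needs yet.
class TaxStrategy:
    def rate(self, region: str) -> float:
        raise NotImplementedError

class FlatTaxStrategy(TaxStrategy):
    def __init__(self, rate: float):
        self._rate = rate
    def rate(self, region: str) -> float:
        return self._rate

class OrderProcessor:
    def __init__(self, strategy: TaxStrategy, currency_converter=None):
        self.strategy = strategy
        self.currency_converter = currency_converter  # anticipated, never used
    def total(self, subtotal: float, region: str) -> float:
        return subtotal * (1 + self.strategy.rate(region))

# "Make it work" first: the whole requirement fits in one function.
def total_with_tax(subtotal: float, rate: float = 0.25) -> float:
    return subtotal * (1 + rate)

# Both compute the same answer; only one carries extra machinery to maintain.
print(OrderProcessor(FlatTaxStrategy(0.25)).total(100.0, "US"))
print(total_with_tax(100.0))
```

If a second tax regime or a new currency ever shows up in a real use case, that is the moment to introduce the abstraction, not before.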
Make it fast.
To avoid premature optimization, wait until after the solution is developed to start optimizing the code. It does little good to remove the overhead of an operation that represents less than one percent of the overall computation. I once worked on a system that placed pending work in a priority queue. The initial “makeshift” solution was a linked list, so removing the highest-priority task required looking at all pending tasks. Five years later this code remained unoptimized because, in practice, the list never held more than three tasks.
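The post doesn’t show that system’s code, so the sketch below is my own reconstruction of the idea in Python (class and method names invented). Pending work sits in a plain list, and popping the highest-priority task scans every entry, an O(n) operation that is perfectly fine when n never exceeds three.

```python
# A rough sketch of the "makeshift" priority queue described above.
# Removing the highest-priority task scans the whole list, which only
# matters if the list ever gets long. In practice it held ~3 tasks.
class PendingWork:
    def __init__(self):
        self._tasks = []  # (priority, task) pairs, stored unordered

    def add(self, priority: int, task: str) -> None:
        self._tasks.append((priority, task))

    def pop_highest(self) -> str:
        """Look at every pending task and remove the highest-priority one."""
        best = max(range(len(self._tasks)), key=lambda i: self._tasks[i][0])
        return self._tasks.pop(best)[1]

queue = PendingWork()
queue.add(2, "rebuild index")
queue.add(5, "serve user request")
queue.add(1, "compact logs")
print(queue.pop_highest())  # serve user request
```

Swapping in a heap would be a few lines of work, but until profiling shows the scan matters, that work buys nothing.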
Summary.
- Make it work: you cannot know what you want/need before building it, so don’t try for the perfect solution out of the gate.
- Make it right: avoid premature generalization by making it work on many use cases before building abstractions.
- Make it fast: avoid premature optimization because not all code is equally critical.