I was glad to see this one on Stanford women teaching high school girls to code. Everybody wins with efforts like these to balance the tech world and fight its long-term skew, since tech and programming have traditionally had more men than women.
I had sensed a personal crash coming. For a decade and a half, I’d been a web obsessive, publishing blog posts multiple times a day, seven days a week, and ultimately corralling a team that curated the web every 20 minutes during peak hours.
That’s from Andrew Sullivan: My Distraction Sickness — and Yours, in New York Magazine this week. I recognized the author’s name immediately because I’ve often seen Sullivan on talk shows. On TV he comes off as thoughtful and articulate, and he’s frequently introduced as a gay Republican and prolific blogger. Here’s the first paragraph of his Wikipedia biography:
Andrew Michael Sullivan (born 10 August 1963) is an English author, editor, and blogger. Sullivan is a conservative political commentator, a former editor of The New Republic, and the author or editor of six books. He was a pioneer of the political blog, starting his in 2000. He eventually moved his blog to various publishing platforms, including Time, The Atlantic, The Daily Beast, and finally an independent subscription-based format. He announced his retirement from blogging in 2015.
The independent blog mentioned is The Dish, where the last post is dated June of 2015.
But that’s just background. What’s notable, in this context, is the sudden change: the absence of Andrew Sullivan writing and talking on TV over the last year. I read this piece and discovered why. And I decided that what he’s calling distraction sickness might be an epidemic.
Sullivan describes a process that seemed alarmingly familiar to me – and, I bet, to you too:
Facebook soon gave everyone the equivalent of their own blog and their own audience. More and more people got a smartphone — connecting them instantly to a deluge of febrile content, forcing them to cull and absorb and assimilate the online torrent as relentlessly as I had once. Twitter emerged as a form of instant blogging of microthoughts. Users were as addicted to the feedback as I had long been — and even more prolific. Then the apps descended, like the rain, to inundate what was left of our free time. It was ubiquitous now, this virtual living, this never-stopping, this always-updating.
Is that not you? Ok. Nobody you know? C’mon, tell the truth.
I tried reading books, but that skill now began to elude me. After a couple of pages, my fingers twitched for a keyboard. I tried meditation, but my mind bucked and bridled as I tried to still it. I got a steady workout routine, and it gave me the only relief I could measure for an hour or so a day. But over time in this pervasive virtual world, the online clamor grew louder and louder. Although I spent hours each day, alone and silent, attached to a laptop, it felt as if I were in a constant cacophonous crowd of words and images, sounds and ideas, emotions and tirades — a wind tunnel of deafening, deadening noise. So much of it was irresistible, as I fully understood. So much of the technology was irreversible, as I also knew. But I’d begun to fear that this new way of living was actually becoming a way of not-living.
Is this you?
I’m not attempting to duplicate Sullivan’s whole article here. I highly recommend you read it yourself and think about it. But here’s one more piece of it I want to add:
Our oldest human skills atrophy. GPS, for example, is a godsend for finding our way around places we don’t know. But, as Nicholas Carr has noted, it has led to our not even seeing, let alone remembering, the details of our environment, to our not developing the accumulated memories that give us a sense of place and control over what we once called ordinary life. The writer Matthew Crawford has examined how automation and online living have sharply eroded the number of people physically making things, using their own hands and eyes and bodies to craft, say, a wooden chair or a piece of clothing or, in one of Crawford’s more engrossing case studies, a pipe organ.
It certainly made me think about my level of the disease. For a split second. Before diving back into blogging.
Ah yes, the good old days. How quickly time passes. And I can’t help occasionally browsing through technology, looking back. My youngest daughter is in her late twenties now. She can barely remember life before cellphones, and can’t remember life before personal computers or VCRs, because both of those were born before she was. I was talking with a grandkid the other day, and she couldn’t conceive of a world before amazon.com.
Every so often I get reminded how far we’ve come. When I graduated from college in 1970:
The university had a computer in a basement that took up the space of an SUV and had way less power than an iPhone does now. Computer science students programmed it with perforated cards.
The dorms had one phone per floor. Long distance calling costs were significant. I was in the Midwest, so I’d call my parents in California once every couple of months.
We wrote letters. We read letters.
We used typewriters for every college essay, paper, and assignment. We’d often retype an entire page to correct an error. Sometimes we’d reword things to make the pages end or begin with the correct word so we could insert an additional page.
Four-function calculators existed, but nobody we knew had one. You could have bought a new low-end car for the price of two four-function calculators.
I did my sophomore year abroad, and the university sent us from New York to Europe on an ocean liner. That was cheaper than flying.
We wrote checks when we had to, used cash most of the time, and we got the cash from the bank teller window, not an ATM.
Credit cards were rare. Our parents had them.
Television was broadcast over the air. We watched in real time or not at all. We had 5 or 10 channels to choose from.
When we were driving, we listened mostly to the hits on AM radio, or cassette tapes when we could.
And that’s just technology, or a smattering of technology. When I think of social evolution, and environmental deterioration, the end of the cold war, the rise of terrorism, polar ice caps … like we used to say: “far out, man.”
Is this you? Over and over again, you fall off regular, consistent organizational practices: to-do lists, email, planning, backing up your computer. Then you run across some cool new productivity tool. You jump on the bandwagon enthusiastically, promising yourself that you’re finally going to get organized and stay organized. You spend happy hours reorganizing everything to fit the new tool. Then, over time, as the novelty wears off, you end up right back where you started, with the same problems. Cool productivity tools, no productivity.
And then you find a new cool tool and run the same cycle over again.
This is me and productivity tools
I will tell you that this is definitely me. I’ve done this all my life. I veer off after a new organizational system like a dumb fish chasing a shiny new lure in the water. And I see other people doing it too, all the time, all around me. You don’t need a new spreadsheet, or to-do list software, or project planning system; you need to use what you have regularly.
I end up wasting the time it takes to reorganize to the mindset of the cool new tool, repeatedly, instead of managing to follow up on any one thing consistently over a long time.
And what works in the real world is not the tool, not any of the damn tools, but rather the following up. It’s the human behavior that matters: the good habits, consistently applying methods, not getting bored with them, not rationalizing your way out of them.
I apologize for mixing metaphors here, but I can’t resist referring to the Rime of the Ancient Mariner, with “Water, water everywhere, and not a drop to drink.” Given the world we live in, of computers and the web, it’s something like: “Tools, tools everywhere, and not a drop of productivity.”
Or so it seems.
Now, the question: what are we going to do about it?
Friday video: a TED talk. Girls Who Code founder Reshma Saujani is out to change the way the world looks at girls, tech, and girls in tech. Her non-profit Girls Who Code inspires high school girls to study computer science. She aims to enroll one million women in the program by 2020, and tech has stepped in to help: Google and Twitter are backers, and engineers at Facebook, AT&T, and others have signed on as mentors. Here’s a quote:
Most girls are taught to avoid risk and failure. We’re taught to smile pretty, play it safe, get all A’s. Boys, on the other hand, are taught to play rough, swing high, crawl to the top of the monkey bars and then just jump off headfirst. And by the time they’re adults, whether they’re negotiating a raise or even asking someone out on a date, they’re habituated to take risk after risk. They’re rewarded for it. It’s often said in Silicon Valley, no one even takes you seriously unless you’ve had two failed start-ups. In other words, we’re raising our girls to be perfect, and we’re raising our boys to be brave.
Have you heard the standard cliché, “Necessity is the mother of invention”? In business technology and productivity, in my experience at least, the old standard is reversed: the new truth is that “invention is the mother of necessity.”
Spreadsheets and budgeting: When I started in business analysis back in the middle 1970s we didn’t have spreadsheets, and a budget was rarely more than a list of numbers on a yellow pad, processed with a calculator and a pen. Then came VisiCalc, and shortly after that Lotus 1-2-3, and then Excel. Now, not at all by coincidence, everybody in business does a whole lot more budgeting with spreadsheets than we ever would have imagined back then.

So what’s happened is that because spreadsheets made budgeting more accessible, the world started demanding more budgets. To me, this is a good thing. Budgeting is good for business. You could argue, however, that maybe the world of small and medium-sized business was better off when it summarized budgets into a few key items.

Ultimately, in this case, I think it’s obvious that we do more budgets because budgets are easier to do.
Desktop publishing and business documents: I’m pretty sure I’ve seen the same thing happen with desktop publishing. Before desktop publishing appeared with the Macintosh and the Apple LaserWriter in the middle 1980s, people put business correspondence onto simple pages printed on letterhead paper. Nowadays we take desktop publishing techniques for granted. People routinely merge graphics and text into simple memos, letters, and standard business documents without thinking twice about it.
Did This Improve Productivity?
That’s an interesting question. Ten years ago I would have been tempted to say no, that it hadn’t improved productivity. More recently I’ve changed my mind. Running a company makes me sure that we benefit from the power of more detailed budgeting, and running through the daily process of management makes me pretty sure that business documents communicate better with desktop publishing than without.
Have you been to a hackathon? Do you know what that is? People get together to play with programming and in a single weekend create something real. It’s an amazing phenomenon.
Tomorrow and Sunday an MIT group is sponsoring one of these (HackMIT) that’s open to college students from anywhere. And free. Most work in groups of no more than four; some do it alone. They start Saturday morning, work through Saturday night if they want (many do), and by Sunday evening they have something real. It’s an amazing celebration of the magic that is programming.
What do they do? What do they make? The organizers say:
Anything goes! Web, desktop, mobile, and hardware projects are all welcome. (All hacks should be computer-related, though.) Projects will be judged based on creativity, technical difficulty, polish, and usefulness.
I was lucky enough to see the finals of one of these in New York a couple of years ago (HackNY.org). What a kick! They had real stuff to show, mobile apps in that case, done in a single weekend, but done well. The prize ceremony was great entertainment.
I love the name “hackathon” as a reminder that “hacking,” when I first heard the term, was a good thing: it meant making things and solving problems on computers. It’s also a reminder of what a kick it is to build something that actually is something, and in a single weekend.
As Arthur C. Clarke said:
Any sufficiently advanced technology is indistinguishable from magic.
Interesting post: Amazon Just Beat Apple to the Classroom, on Gizmodo. I’ve been following ebooks and textbooks for more than 10 years now, expecting disruption. Textbooks are obsolete. It should have happened years ago. And there’s a lot going on now, but classrooms are still the same.
In this one, post author Brian Barrett starts by quoting himself from a few months back after Apple presented an iPad solution to classroom learning and textbooks. Brian said then:
Let’s be clear; this is indisputably the future. What we saw today is what our classrooms will look like once iPads are far cheaper, once digital textbooks can be handed down as easily as physical ones, once teachers of every subject have several educational material options to choose among. For now though, it’s important to remember that “new” and “different” always come at a premium. One that the vast majority of us can’t afford.
Brian says that’s as true today as it was then, but …
But look at how Amazon’s offerings have grown since then. A backpack-friendly 7-inch tablet for $160 (and E-ink technology has progressed enough that you could probably make do with a $70 entry-level model). A Kindle eTextbook service that’s ballooned to over 200,000 titles, with generous return policies and cash-saving rental options. And a platform ubiquity that ensures no kid gets left out, regardless of what device he or she owns.
And then this, on Whispercast:
But today’s announcement of its Whispercast technology seems to solve problems Apple hadn’t even thought of.
Whispercast is a free service that serves as an umbrella for many, many Kindle management features, but most of all it provides the kind of centralized control over devices that are a luxury for businesses and a necessity for schools. Content distribution, social media and purchase blockades, password protection, document sharing; there couldn’t be a more teacher-friendly checklist.
Sigh … I guess that’s good news. But it’s sad, at least in some ways, that control, blockades, and protection are barriers to better technology in schools. I can see why — lawsuits, fanatics, porn, bullying, and so forth — but still. Damn.
(Complete aside: I like the lead from a writing point of view. Here’s Brian’s first sentence from today’s post:
On a freezing, cloudless day last January in New York, Apple presented to the world its vision for the future of education.
The freezing cloudless day has nothing to do with the rest of the post, but it’s an interesting start.)
Whoops. It suddenly occurred to me: the old Mac-Windows rivalry is dead. There goes a bit of industry history.
It used to be fun, back in the old days, when it mattered. If you’re old enough you’ll remember the famous 1984 Macintosh ad. I was generally forgiven by the Mac zealots for my weakness for Windows, but only because I also used Macs and recognized their superiority. My Mac friends treated my sympathy for Windows systems as a forgivable flaw in my character.
I used to tell this modified version of an old joke:
Somebody dies and goes to heaven. On arrival, St. Peter gives him the quick tour of the place. As they go through heaven from place to place, they look at the mall, the school, the park … and they keep seeing a high wall on one side or the other. Finally, the new arrival can’t resist asking: “What’s with the wall?” St. Peter answers: “That’s where we keep the Mac users. They like to think they’re the only ones here.”
I like Apple. I consulted with Apple from 1982 to 1994. Apple loaned me an Apple II in 1983 and a Macintosh early in 1984. I wrote the first book laid out on an Apple LaserWriter (at least according to me and McGraw-Hill Microtext, the publisher). As a consultant to Apple, I worried as Windows started to effectively imitate the Mac — not that it was as good, but it was good enough to fool a buyer in a store. And it was personally painful to me when Windows so dominated business computing, in the late 1990s and early 2000s, that we (temporarily) dropped our Mac business plan product. We really had to. By 2000 a Mac product cost about ten times as much to develop as a Windows product, for a market about one-tenth the size. Business is business.
By 2004 my computing was all Windows. It wasn’t torture. Windows worked. I use a computer to get things done, and Windows did. I may have still preferred the Mac, but hey, business is business.
And then the Mac came back. We saw them first in airports, the MacBooks, silently gaining strength and visibility. Then there was the iPhone, and more MacBooks. And then the gorgeous new iMacs. I taught an entrepreneurship class at the University of Oregon from 1998 through 2009. In the beginning all my students had Windows laptops. By the end, 80% of them were on Macs.
Once again, being Mac literate is good business. At Palo Alto Software, our LivePlan SaaS app is browser-based, operating system neutral, and developed mostly on Macs. And Mac software, and the Mac software market, are growth markets again. The app store works. Happy ending.
So now I’m almost all Mac again. I have two iMacs at home, a MacBook Air, an iPhone, and an iPad, and I love it. An old friend. Isn’t computing great? And my Windows 7 desktop, in the office at the company, still works just fine too, thanks. It’s not good and evil, just computing.
Earlier today I posted disruption vs. revenue and the tech bubble on the gust.com blog. I’m suggesting in that post that some special-case web-based startups have to choose between disruption and revenue, because they can’t have both.
That may or may not be true, but I’ve been guilty of suggesting it is to a couple of startup software companies recently. I think both were special cases. They had a real chance to go really big and generate spontaneous buzz based on the product itself. But locking their wares behind paywalls might slow their growth and dampen their success.
That may or may not be true. After the tech crash in 2000, I never thought I’d see that happen again, much less me recommending not covering expenses with revenue. Still, though, there’s Facebook and Twitter and Instagram and, oh my.
When this next bubble pops — and it will pop — the idea of making no money can finally pop, too. Then investors can start working with companies to build businesses with long-term financial goals, instead of just building a short-term mystery.
But on the same day, Chris Dixon (smart person) asked Is It a Tech Bubble? on his blog and answered, with some convincing analysis, “no.” And the second comment on that post is Fred Wilson of AVC (another smart person) saying:
[Zynga price] certainly doesn’t seem like a bubble valuation either. I do think there is more money sloshing around the tech/internet/mobile sector now than there has ever been. and that is impacting valuations across the board. The question is if this is temporary or the “new normal”. I guess we will find out.
So I’d like to answer this tech bubble question here, but as I was writing this, on Sunday, other interesting and contradictory posts from smart people kept rolling onto the web. I ended up tweeting my conclusion to this post last Sunday:
Uncertainty is a sign of intelligence. Maybe. I’m not sure.