When the mayor of San Francisco learned that Twitter was threatening to leave the city, he grew worried and quickly called a meeting with the social network’s chief executive.
Dick Costolo told Mayor Ed Lee that the city’s tax policies were just too restrictive. If Twitter were to double its employee count from 450 to 1,000, it couldn’t stay in a city that based its business tax on payroll rather than gross receipts.
The incident was a wake-up call, according to a new profile in San Francisco magazine. An initiative on the city ballot this year would change San Francisco’s tax structure to better accommodate the tech industry, and Lee now meets with tech companies at least once a week.
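As a rough illustration of why that distinction matters, consider the sketch below. The rates, salary, and revenue figures are hypothetical assumptions chosen only for illustration; they are not taken from the profile, San Francisco’s tax code, or Twitter’s finances. The point is structural: a payroll-based tax grows with every hire, while a gross-receipts tax tracks revenue instead.

```python
# Hypothetical sketch of a payroll tax vs a gross-receipts tax.
# All rates and dollar figures below are illustrative assumptions.

PAYROLL_TAX_RATE = 0.015       # assumed tax rate on total payroll
GROSS_RECEIPTS_RATE = 0.005    # assumed tax rate on gross receipts
AVG_SALARY = 120_000           # assumed average salary per employee
ANNUAL_RECEIPTS = 100_000_000  # assumed gross receipts, unchanged by hiring

for headcount in (450, 1000):
    payroll_tax = headcount * AVG_SALARY * PAYROLL_TAX_RATE
    receipts_tax = ANNUAL_RECEIPTS * GROSS_RECEIPTS_RATE
    print(f"{headcount} employees: payroll tax ${payroll_tax:,.0f}, "
          f"gross-receipts tax ${receipts_tax:,.0f}")

# Output:
# 450 employees: payroll tax $810,000, gross-receipts tax $500,000
# 1000 employees: payroll tax $1,800,000, gross-receipts tax $500,000
```

Under the payroll scheme, doubling headcount doubles the bill; under the gross-receipts scheme it doesn’t move at all, which is presumably the crux of Costolo’s complaint.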
He has met with the heads of companies including Yelp and Yammer to hear what these companies value most – after all, San Francisco is in the heart of Silicon Valley.
The biggest and best tech companies in the world are grown in San Francisco. But if even Twitter came close to leaving, is the city becoming oversaturated?
As the piece explains, there may be trouble in the tech capital of the world.
The unique urban features that have made San Francisco so appealing to a new generation of digital workers—its artistic ferment, its social diversity, its trailblazing progressive consciousness—are deteriorating, driven out of the city by the tech boom itself, and the rising real estate prices that go with it.
Rents are soaring: Units in one Mission district condominium complex recently sold for a record $900 per square foot. And single-family homes in Noe Valley, Bernal Heights, and other attractive city neighborhoods are selling for as much as 40 percent above the asking price.
Again and again, you hear of teachers, nurses, firefighters, police officers, artists, hotel and restaurant workers, and others with no stake in the new digital gold rush being squeezed out of the city.
San Francisco is famed for being a city of the people. But those people are slowly being squeezed out by the tech industry’s success, and many of them don’t feel they’re sharing in it.
This all becomes scarier when you realise how closely the current boom resembles the dot-com bubble of the late 1990s.
That first San Francisco tech bubble popped more than a decade ago. But the new one, despite the recent dips of Facebook and Zynga shares, promises to be even fatter—and potentially more damaging to the soul of the city. Once you start to look around, the warning signs are everywhere.
Whether or not there’s another tech bubble about to burst, it’s clear San Francisco is having teething pains. Dealing with its oversaturation is a major priority for the city – and it’s a problem politicians will have to solve if they want to stop tech companies like Twitter from starting an exodus.
Who built the internet?
Who actually built the internet?
It’s a surprisingly controversial question, and there are two sides to the argument – those who credit government and those who credit private enterprise.
The debate has been sparked again by a comment from Barack Obama, who said: “Government research created the internet so that all the companies could make money off the internet.”
But a writer in The Wall Street Journal claimed this was just “urban legend” and credited companies like Xerox and Apple.
The answer, as it turns out, is much more interesting, according to this new piece in The New York Times.
Like many of the bedrock technologies that have come to define the digital age, the internet was created by — and continues to be shaped by — decentralized groups of scientists and programmers and hobbyists (and more than a few entrepreneurs) freely sharing the fruits of their intellectual labor with the entire world.
Yes, government financing supported much of the early research, and private corporations enhanced and commercialized the platforms.
But the institutions responsible for the technology itself were neither governments nor private start-ups. They were much closer to the loose, collaborative organizations of academic research. They were networks of peers.
If you’ve ever wanted a good history of the internet, this is a fine place to start. You’ll find the answer is far more complicated than either side lets on.
The environmental problem with data centres
Data centres. They’re one of the best investments a company can make, and yet they come with a big problem of their own – they’re simply sucking up too much power.
There’s been a trend among data centre builders to make them as green as possible. Apple’s recent construction project in North Carolina is powered by a giant solar farm.
But a comprehensive, year-long investigation by The New York Times has found that many of these facilities aren’t as environmentally friendly as they appear.
Most data centers, by design, consume vast amounts of energy in an incongruously wasteful manner, interviews and documents show. Online companies typically run their facilities at maximum capacity around the clock, whatever the demand. As a result, data centers can waste 90 percent or more of the electricity they pull off the grid, The Times found.
It’s a shocking finding. While energy efficiency varies, consulting firm McKinsey found that data centres use only about 6-12% of their electricity to perform computations.
“The rest was essentially used to keep servers idling and ready in case of a surge in activity that could slow or crash their operations.”
“This is an industry dirty secret, and no one wants to be the first to say mea culpa,” said a senior industry executive who asked not to be identified to protect his company’s reputation. “If we were a manufacturing industry, we’d be out of business straightaway.”
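To see how those numbers fit together, here is a back-of-the-envelope sketch. The facility size below is an arbitrary assumption; the only figure taken from the reporting above is the 6-12% range. If roughly that share of the power does computation, the remaining 88-94% goes to keeping idle servers spinning, which is where the “90 percent or more” figure comes from.

```python
# Back-of-the-envelope sketch: how much of a data centre's power draw does
# useful work vs keeps idle servers spinning. The 10 MW facility is an
# arbitrary assumption; the 6-12% range is the McKinsey figure quoted above.

FACILITY_DRAW_MW = 10.0  # assumed constant, round-the-clock grid draw

for useful_fraction in (0.06, 0.12):
    useful_mw = FACILITY_DRAW_MW * useful_fraction
    idle_mw = FACILITY_DRAW_MW - useful_mw
    print(f"{useful_fraction:.0%} useful -> {useful_mw:.1f} MW computing, "
          f"{idle_mw:.1f} MW ({idle_mw / FACILITY_DRAW_MW:.0%}) effectively wasted")

# Output:
# 6% useful -> 0.6 MW computing, 9.4 MW (94%) effectively wasted
# 12% useful -> 1.2 MW computing, 8.8 MW (88%) effectively wasted
```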
Data centres are a great investment. But they’re sucking up far more power than they need – and in the long run that inefficiency won’t just affect the centres themselves, but their customers too.