Thursday, May 9, 2013

May 8 - 2014 Chevrolet Silverado High Country  Silverado High Country is distinguished by a unique chrome grille with horizontal chrome bars, halogen projector headlamps and body-color front and rear bumpers. Inside, Silverado High Country features an exclusive saddle brown interior, with authentic materials throughout. Features include heated and cooled perforated premium leather front bucket seats with High Country logos on the headrests, Chevrolet MyLink connectivity with an 8-inch touch screen, Bose premium audio and front and rear park assist.
May 6 - 2013 Buick Riviera Concept  This stylish coupe combines avant-garde aesthetics and advanced technology, including plug-in hybrid electric vehicle propulsion, in a single package.
May 2 - 2012 Bertone Nuccio  The Nuccio is an "extreme", fully functional sports car, which develops the Bertone genetic code with an evolutionary flair: a 480 hp, 4.3-liter V8 mid-engine "berlinetta".
May 6 - 2013 Porsche 911 Turbo & 911 Turbo S  The newest version of the iconic 911 Turbo features increased horsepower (520 & 560 hp) while at the same time offering a 16% increase in fuel efficiency. This widest of all 911s has a new AWD system with electro-hydraulic control, as well as rear-axle steering for improved handling. Porsche continues to push the envelope.


    HTC First phone
    AT&T has a special deal going on for the HTC First, a smartphone dedicated to Facebook users. You can pick up the handset for $0.99 with a two-year contract or pay $350 for a contract-free experience. While that sounds great for consumers, I wonder what it really means for Facebook’s biggest effort to date in trying to get a foot in the door of the hardware market.
    It’s not uncommon for handsets to see reduced prices over time. After all, new models appear, making older phones a little more dated. Carriers, which generally buy inventory in advance, then discount the older phones to spur higher sales and the service revenues that come along for the ride. But in the case of the HTC First, there is no successor model available.
    I reached out to AT&T for some thoughts, and while the carrier won’t comment on individual manufacturers’ handset sales, I was told that it’s a promotion, which, as I noted above, isn’t uncommon. There is no indication of if or when the reduced price may end.
    So the discount could be due to slow sales, or not. I suspect it is, mainly because I’m ruling out the other options. For starters, the phone works on AT&T’s LTE network and falls back to speedy HSPA+ service, so there’s no reason to blame the network. As for the phone itself: it’s a mid-range handset made by HTC that I’d consider fairly generic.
    My colleague Om reviewed it (I haven’t had a chance to use the First yet), and as someone who vastly prefers iOS to Android, his impressions were better than I expected. He mostly liked the Facebook Home software, which I have used. I think it’s actually very well done, and it runs nicely on my Galaxy Note 2. But I think this points to the key problem: the market is clamoring for a Facebook phone just as much as it is for a phone built around Twitter or any other social service. Meaning: it’s not clamoring at all.
    It’s difficult enough for a high-end flagship phone to stand out from its peers, let alone a mid-range handset. Frankly, I can’t see how Facebook Home helps the HTC First differentiate itself enough, particularly when the software is already available for download on better phones and is expected to arrive on other handsets in the future. Sorry, Facebook: I don’t think the market likes your attempt at a smartphone.

    Tesla shares soar almost 30% in after-hours trading on profit news.


    The first Model S customer is driven off
    Tesla’s shares are soaring — even more than they already did this week — on news that the company has hit the milestone of delivering the first quarterly profit in the company’s history.
    Tesla’s shares rose at one point almost 30 percent in after-hours trading, to over $70 per share. Earlier this week Tesla’s shares had hit an all-time high of over $60 per share.
    When Tesla held its IPO and started trading back in the summer of 2010, it went public at $17 per share. Its $70-per-share milestone in after-hours trading is more than four times that initial IPO price.
    Tesla and CEO Elon Musk have ambitions far beyond the current market cap and stock price. Musk has a payout package that allocates him shares each time Tesla’s market cap grows by another $4 billion, up until it reaches $43.2 billion, provided accompanying operating milestones are met.

    Viral site BuzzFeed launched a new content vertical on Wednesday called “Community” that consists entirely of user-submitted content.
    While BuzzFeed has relied on reader content for years, the new vertical will increase the visibility of such contributions. It will also increase the chances of a viral pay-off from the site’s high-tech publishing tools. The new “Community” section includes a formal submission process that permits users to submit one post per day until their (what else) “Cat Power” increases, which will allow more frequent submissions.
    “Community has always been a huge part of our site — some of our best posts have come from community submissions — and now we want to reinvent community for the social web,” editorial director Scott Lamb said in an email statement.
    BuzzFeed’s decision to expand the scope of its user-generated offerings comes at a time when media outlets are increasingly looking to commenters as a source of talent and future hires. My colleague Mathew Ingram explained the phenomenon well earlier this week in “Want a job at Gawker Media? You can get a head start by being a regular commenter.”
    The new section is consistent with BuzzFeed’s improbable quest to become more serious and more inane at the same time. In recent weeks, the site has been at the forefront of major news stories like the Boston bombings while also churning out its regular fare like “14 cats who think they’re sushi.”

     

    Summary: Tweets and other public-facing data have limits, but researchers see new value in Wikipedia page views. That data could inform an investment strategy on the Dow Jones that outperforms a random strategy.
    Plenty of companies have been looking at software for analyzing large private data sets and combining them with external streams such as tweets to make predictions that could boost revenue or cut expenses. Walmart, for instance, has come up with a way for company buyers to cross-reference sales data with tweets on products and categories on Twitter and thereby determine which products to stock. Here’s another possible data source to consider checking: Wikipedia.
    No, this doesn’t mean a company that wants to predict the future should take a guess based on what a person or company’s Wikipedia page says. However, researchers have found value in page views on certain English-language Wikipedia pages. The results were published Wednesday in the online journal Scientific Reports.
    The researchers looked at page views and edits for Wikipedia entries on public companies that are part of the Dow Jones Industrial Average, such as Cisco, Intel and Pfizer, as well as entries on economic topics such as capitalism and debt. Changes in the average number of page views and edits per week informed decisions on whether to buy or sell the DJIA. In other words, a major increase in page views could have prompted a sale, followed by a buy to close out the position, or vice versa (a decrease in page views, say, would prompt a buy, followed by a sale).
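    The buy-then-close rule described above can be sketched in a few lines. This is an illustrative toy, not the paper’s methodology: the view counts, prices and the three-week lookback window below are all invented.

```python
# Toy version of a "trade against attention" rule: compare this week's
# Wikipedia page views to the average of the prior few weeks, sell on a
# rise, buy on a fall, and close the position the following week.
# All numbers here are made up for illustration.

def signal(views, week, delta=3):
    """Return 'sell' if this week's views exceed the average of the
    prior `delta` weeks, 'buy' if they are below it, else 'hold'."""
    prior = views[week - delta:week]
    baseline = sum(prior) / len(prior)
    if views[week] > baseline:
        return "sell"   # rising attention -> open a short, close next week
    elif views[week] < baseline:
        return "buy"    # falling attention -> open a long, close next week
    return "hold"

def backtest(views, prices, delta=3):
    """Multiply up weekly returns: a 'sell' profits when the index falls
    the following week, a 'buy' profits when it rises."""
    total = 1.0
    for week in range(delta, len(views) - 1):
        move = prices[week + 1] / prices[week]
        action = signal(views, week, delta)
        if action == "buy":
            total *= move
        elif action == "sell":
            total *= 1 / move
    return total

# Invented data: page views spike while the index slides, then recover.
views  = [100, 110, 105, 150, 160, 120, 90]
prices = [50.0, 51.0, 50.5, 49.0, 47.0, 48.0, 49.5]
print(backtest(views, prices))  # > 1.0 means the rule beat doing nothing
```

    On this synthetic series the rule comes out ahead, which is exactly the kind of result that demands comparison against random strategies, as the researchers did.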
    The researchers compared this investment strategy with a random investing strategy. What they found is that returns based on views of the DJIA company Wikipedia pages “are significantly higher than the returns of the random strategies,” to the tune of a 141 percent return, according to a news release.
    Returns on strategies based on view and edit data for Wikipedia entries on companies in the Dow Jones Industrial Average, courtesy of Scientific Reports.
    There was also a significant difference between the returns from the random strategy and the returns on the strategy tied to page views of economic topics. In that case, the strategy returned 297 percent more than what was put in.
    Returns on strategies based on view and edit data for Wikipedia entries on economic topics, via Scientific Reports
    To check that there wasn’t a hidden variable in the data on views of company and topic pages, the researchers compared the earnings on Dow Jones investments tied to page views of actors and filmmakers, which had just as many page views as the pages on the DJIA companies. Indeed, they found no statistical significance there. And that makes sense in theory — who checks out Matt Damon’s Wikipedia entry before making an investment? But checking a Wikipedia page on Cisco might be a more reasonable action before investing in Cisco.
    Incidentally, some of the researchers behind this project have also investigated connections between the Dow Jones and the use of certain financial search terms on Google. Other researchers have previously found connections between Google search patterns on stocks and stock price changes over time.
    While predictive analytics has become a hot area, with applications ranging from social media conversations to crime, from the flu to retweets, data scientists often acknowledge that the data used for analysis needs to be solid and reliable. Edit data from Wikipedia isn’t inherently reliable, in the sense that anyone can edit it, and in this study it turned out not to be statistically significant. Page views could perhaps be manipulated by a computer pinging Wikipedia again and again, which could throw off an algorithm pulling page-view data in real time.
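    One hedged way to guard a live page-view feed against the ping-style manipulation just mentioned is to flag counts that sit far outside the recent median before letting them drive a trading signal. The window size and threshold below are arbitrary choices for illustration, not anything from the study.

```python
# Flag a page-view count as a suspicious spike when it deviates from the
# median of recent history by many multiples of the median absolute
# deviation (MAD) -- a simple, robust outlier check.

def is_spike(history, value, window=8, threshold=5.0):
    """Return True if `value` deviates from the median of the last
    `window` observations by more than `threshold` times the MAD."""
    recent = sorted(history[-window:])
    med = recent[len(recent) // 2]
    mad = sorted(abs(v - med) for v in recent)[len(recent) // 2]
    mad = max(mad, 1.0)  # floor so a flat history doesn't divide to zero
    return abs(value - med) / mad > threshold

history = [100, 104, 98, 101, 99, 103, 102, 100]
print(is_spike(history, 5000))  # an obvious bot-driven burst
print(is_spike(history, 110))   # an ordinary fluctuation
```

    A real pipeline would combine a check like this with rate-limiting signals from the server side, but even a crude filter limits how much a single scripted client can skew the feed.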
    And tweets can be all over the place — there’s no style guide or fact checking for Twitter. So getting a good read on sentiment based on tweets from, say, Stocktwits can be hit or miss. And Google’s Flu Trends feature, heralded as an early use of crowdsourced data, reportedly overestimated flu breakout late last year.
    Clearly, there are caveats to these data sets. Still, it’s neat to see new models emerging for the uses of public data, and some people who want to make money off Wikipedia metadata might want to experiment with it. Just don’t blame us if the experiments backfire.

    Google, along with the US Geological Survey, NASA and TIME, has shared a quarter-century of photos taken from space that show the surface of the earth and the changes that have occurred over that period. From Google’s blog post:
    We started working with the USGS in 2009 to make this historic archive of earth imagery available online. Using Google Earth Engine technology, we sifted through 2,068,467 images—a total of 909 terabytes of data—to find the highest-quality pixels (e.g., those without clouds), for every year since 1984 and for every spot on Earth. We then compiled these into enormous planetary images, 1.78 terapixels each, one for each year.
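    The "highest-quality pixels" step Google describes is a compositing problem: for each pixel location, keep only the cloud-free samples across a year’s scenes and combine them. The sketch below is a toy version using a per-pixel median; the 2×2 "scenes", their values and the cloud masks are invented, and real Earth Engine composites work on multi-band satellite imagery with cloud-scoring algorithms.

```python
# Toy cloud-free compositing: given several scenes of the same area and a
# mask marking cloudy pixels, output one image holding the median of the
# clear observations at each location.
import math

def composite(scenes, cloud_masks):
    """scenes: list of 2-D grids of brightness values. cloud_masks:
    parallel grids where True marks a cloudy (unusable) pixel.
    Returns one grid with the median of the clear samples per pixel."""
    rows, cols = len(scenes[0]), len(scenes[0][0])
    out = [[math.nan] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            clear = [s[r][c] for s, m in zip(scenes, cloud_masks)
                     if not m[r][c]]
            if clear:  # leave NaN where every scene was cloudy
                clear.sort()
                mid = len(clear) // 2
                out[r][c] = (clear[mid] if len(clear) % 2
                             else (clear[mid - 1] + clear[mid]) / 2)
    return out

scenes = [
    [[0.2, 0.9], [0.4, 0.5]],   # scene 1: top-right pixel is a cloud
    [[0.3, 0.3], [0.4, 0.6]],   # scene 2: clear everywhere
    [[0.2, 0.4], [0.9, 0.5]],   # scene 3: bottom-left pixel is a cloud
]
masks = [
    [[False, True],  [False, False]],
    [[False, False], [False, False]],
    [[False, False], [True,  False]],
]
print(composite(scenes, masks))
```

    Scaling this from a 2×2 toy to 909 terabytes of Landsat data is, of course, where Earth Engine’s distributed infrastructure earns its keep.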
    The changes on our planet are stunning. A glacier has almost vanished, a new city has been created and the Amazon rainforest has been decimated. Our capability as humans to destroy our planet and re-create it is astonishing. The whole project makes you realize that in order to understand something important and profound, you have to look at it over a period of time.
    Check it out on Google’s Timelapse website.

    Monday, May 6, 2013


    Telcos feel like they are between a rock and a hard place. When you consider the transition to all-IP networks, the margin pressures that come with meeting insatiable demand for mobile data, and the threat that over-the-top services pose to their businesses, it’s clear they are doing more than just trying to change a jet engine mid-flight: they are trying to replace the engine while others loot the plane for parts. Meanwhile, the skies are getting more crowded, with more flyers demanding more routes.
    Telcos must invest in their infrastructure, even as demand for their services rises. Yet they cannot expect revenue to keep rising at the pace of consumption, and in some cases, such as text messaging and voice calls, revenue is falling. So far their response has been to decry bandwidth hogs and to implement new pricing plans that try to hold the line on the dollars coming in, even as users switch to over-the-top alternatives.
    But some are realizing that that’s not enough. They are investing in technologies such as OpenFlow, or at least software-defined networking, as they try to get a handle on their costs. And they are demanding that their suppliers provide specialized software running on commodity hardware, as opposed to the pricey, proprietary boxes of previous generations of technology.

    Metaswitch’s big switch for telco gear.

    For example, Metaswitch, a three-decade-old company based in San Francisco, has created Project Clearwater, a software-based IMS core for telephone networks. An IMS (it stands for IP Multimedia Subsystem, for those who care about these things) is the glue that connects the old analog wireline systems to the newer digital systems. The thought behind IMS was that mobile operators would use it as a bridge into the IP world, but in reality IMS deployments proved complex and expensive, and telcos put off making those investments.
    As Metaswitch looked at the market two years ago, it saw an opportunity. The company, which provides other gear to wireless carriers, saw that the world was changing. So, CTO Martin Taylor said, the company decided to build an IMS core that ran on commodity hardware. And if that wasn’t revolutionary enough (remember, we’re talking about telcos here), on May 8 Metaswitch will open source the software.
    Taylor points out that telcos used to run the largest-scale systems, but that is changing. The globe-spanning, five-nines networks that once inspired such awe are now commonplace, as Google, Microsoft and others build out their own global infrastructures. And telcos know that to keep up they must adopt the same tricks the web-scale companies have: open source software and commodity hardware. Thus Metaswitch will open source its Clearwater software and follow a Red Hat model of supporting it and releasing regular updates. Taylor has the right idea, but telcos need to go even further.

    But the real solution isn’t open-source software

    As forward-thinking as Metaswitch is, with its open source business model and its software-based IMS core built for commodity hardware, its customers are making a mistake if they rely on Metaswitch to hold their hands. As the telco network comes to look more like cloud and web-scale infrastructure, in that it takes on more load without adding costs, telcos need to think like real cloud vendors and web-scale companies, not like enterprise IT customers.
    Telcos are providing essential infrastructure in their mobile networks. Many of them also provide cloud computing services. In yesteryear it was enough to just provide the pipes, but if you’re going to provide compute and networking infrastructure today you need to adjust to the new reality for infrastructure providers.

    The new infrastructure reality

    And that reality is that you need to own your systems. Infrastructure is going to be a commodity, even in mobile access (look at Free Mobile’s plans in France, or even Republic Wireless here in the U.S., if you want to see the future). And people are going to want more and more of it, so the build-out had better be cheap. So if telcos really want to be cloud providers, and they really want to compete in an all-IP world, they need to stop demanding hand-holding from their vendors, hire smart people to own their infrastructure development, and get off their butts and start innovating.
    For example, Amazon doesn’t hire a company to provide help on its operating systems or databases. When it chooses an open source technology it also chooses and hires smart people to make sure that technology is up and running and maintained. Google, Facebook, Netflix, they all operate the same way in the core areas of their business. Because when you cut out the middle man you cut costs. When you have smart people on staff, you can keep innovating at your pace and in the direction you want to go.
    So if you’re going to be an infrastructure provider, that mindset and skill set is par for the course. And telcos do not seem to get this.
    They can’t say they want to be like Amazon and play in that world while expecting their vendors to do the work. They’ve got to embrace not just the technologies but the economic realities of competing in the commodity, cutthroat business that is cloud and IP networking. Otherwise they will begin a long decline.