Posts Tagged ‘Microsoft’

Yahoo, Microsoft Search Renewal Could Help Portal Renew Its Own Search Tech

April 17th, 2015

Yahoo and Microsoft have agreed to renew their search alliance, but in a way that could help Yahoo renew its own search technology.

In 2009 the two companies struck a deal under which Microsoft’s Bing search engine would exclusively provide the search results and automated search-ad buys on Yahoo’s sites, while Yahoo handled direct search-ad deals with advertisers. Industry experts viewed the deal as Yahoo — one of the most dominant early search engines before Google overtook the market — outsourcing its search technology to Microsoft.

Since former Google search exec Marissa Mayer took over Yahoo as CEO in July 2012, she has reportedly been trying to wiggle out of the agreement in order to reassert Yahoo’s search business. For example, the deal’s exclusivity clause only applied to desktop, so under Ms. Mayer, Yahoo was able to roll out an ad marketplace called Yahoo Gemini to automate the sale of Yahoo’s mobile search ads.

Now Yahoo has blown the loophole wide open. Under the renewal terms announced on Thursday, Yahoo’s deal with Microsoft is no longer exclusive on desktop or mobile. That means Yahoo could strike similar deals with other companies that own search engines, such as Google, Facebook or Amazon, to source results for its search pages as well.

But perhaps more importantly, it means Yahoo can rebuild its own search technology. Google and Facebook have been able to build multibillion-dollar ad businesses that are bigger than Yahoo’s by being able to dig into all their user data to personalize search results and social feeds as well as the adjoining ads. Yahoo has been able to do that to a point, but operating its own search engine could give it more data and more ownership of that data.

It’s hard to say how positive the search alliance has been for Yahoo. Yahoo credited it with 35% of the $4.62 billion in revenue it made last year, but Microsoft has had a hard time meeting certain agreed-upon revenue markers, which has hurt how much money Yahoo makes per search query. And Microsoft’s and Yahoo’s combined share of the global search-ad market this year will be only 6.5%, compared with Google’s 54.5%, according to eMarketer estimates.

A rebuilt Yahoo search engine could also help the portal strike the deal it has reportedly been trying to negotiate with Apple, which would oust Google as the default search engine in Apple’s Safari web browser, as Yahoo did with Mozilla’s Firefox browser last year.

From a revenue perspective, Yahoo’s search business has been a rare bright spot. While the company’s display advertising business hasn’t had a quarter of year-over-year revenue growth since the third quarter of 2012, Yahoo’s search business has grown its quarterly revenue year-over-year for each of the past four quarters. If Yahoo is able to take advantage of the loosened search alliance, that trend could be expected to continue.


IT buyers calling it quits with Silicon Valley

November 13th, 2014

As enterprises discover the need to build their own software, they’re ending the romance with Silicon Valley vendors.

Once upon a time, vendors developed and sold software and enterprises bought it. Companies like Oracle and Microsoft grew up in such halcyon days and minted billions of dollars in profits for their troubles.

That was then, this is now.

Today’s enterprises are increasingly assertive, building their own software and, in a small but growing trend, releasing it as open source and inviting others to contribute. Just ask the CEO of Under Armour.

“We are a software company”
Entrepreneur and investor Marc Andreessen was right to declare that “software is eating the world.” In a nutshell, Andreessen argued that technology was no longer something external to a business; it was core, not complement. As such, every company had to be in the business of developing innovative technology.

Under Armour CEO Kevin Plank got the memo.

In an interview with Businessweek, Plank insists that he has no plans to outsource technological innovation to Silicon Valley:

“I’ll be damned if I’m going to cede anything to Silicon Valley or any other technology company, because I believe we are a technology company. And if the phone is going to get integrated into the shirt, should that be a technology company making apparel or the apparel company starting to make technology? I choose the latter, and that’s exactly where I’m pushing my company.”
He’s not alone.

For example, while we like to talk about Hortonworks filing its S-1 to go public on the back of market adoption for Hadoop and other big data technologies, the more interesting news is “Barclays Bank…work[ing] with Commonwealth Bank of Australia on the development of open-source tools for analysing large data sets.” When banks or other IT buyers stop buying and start sharing between themselves, cutting out the vendor middleman, that’s a big market shift.

So big, in fact, that the impact of open source and other trends like cloud are dramatically changing how “software vendors” define their businesses.

As Redmonk analyst Stephen O’Grady notes:

“Vendors… will continue to point to ‘software’ as their primary revenue source. But the reality is that when successful companies say ‘software,’ they will actually mean software plus some combination of public cloud infrastructure, hardware/appliance, automated management/monitoring capabilities, hosted micro-services, and data enabled analytics. The majority of which is software, of course. Just not strictly software as we have been conditioned to think of it.”
Yes, there will always be a need for software vendors, because it simply won’t make sense for enterprises to build some software themselves. While it will differ from company to company, there will always be functionality that doesn’t generate competitive advantage for a company, making it ripe for outsourcing to a vendor.

However, far more software will need to be written in-house.

Increasing competition for developers
All of this means that the competition for developers is about to get even worse. Silicon Valley has been on a hiring binge for engineers for years, and that same trend is about to hit the rest of the world.

Asked about recruiting, Plank was candid about his quest to assemble a developer army:

“I’ll say this, sporting goods has not always had very intelligent people. We have not attracted the best and the brightest. They’ve gone to Silicon Valley. They’ve gone to Wall Street. Now, we’ve got a pretty powerful ecosystem that we can tap into, especially with the acquisition of MapMyFitness. I have 100 engineers, and a year ago, I had zero. I have nine Ph.D.s, and a year ago, I had zero.

“I hear people say, ‘I want to go work at Google,’ and I think ‘What are you going to do at Google? It’s a search engine.’ The ability to touch people and literally change lives is incredibly relevant in a consumer-products company.”

The good news implicit in Plank’s statement is that developers won’t need to leave the Midwest or Far East or wherever they happen to live. Increasingly, they’ll be able to build interesting software right where they are, without the headache commutes up 101 or the crushing mortgages of Los Altos.

In fact, as I’ve previously written, “The older the industry, the more profound the change big data [and a great developer] can make.”

Breaking down barriers
There has never been a better time to work in technology and, let’s face it, who isn’t in the technology business these days? Or rather, who shouldn’t be? Forward-looking companies like Netflix are already chest-deep in technology, laying waste to more laggardly competitors like Blockbuster. Going forward, companies that embrace technology as central to who they are will win, and everyone else will evaporate into Chapter 11 obsolescence.

That’s the good and bad news. It won’t be pleasant for companies ill-prepared to embrace technology. But for developers, this news is only positive. With companies as diverse as Google, Under Armour, and John Deere competing for their services, developers can write their own ticket while taking home a fat paycheck.


Microland bets on hybrid cloud to land large deals

August 27th, 2014

Bangalore-based Microland has embarked on a new strategy to get large outsourcing deals and has launched a new brand identity for itself.

This new strategy involves building on its capability of managing networks and other IT infrastructure of Fortune clients through a mix of private and public cloud, commonly known as hybrid cloud.

Hybrid cloud is a combination of traditional IT and private, public and community cloud computing services from different service providers such as Microsoft and Amazon. Companies like Microland take these offerings and integrate them with clients’ existing businesses. Microland calls this strategy version 4.0; founder and MD Pradeep Kar told BusinessLine that the company sees a multi-billion-dollar opportunity in this area, as companies are starting to consider vendors with expertise in particular technologies.

In line with this, the company has unveiled a new identity in its 25th year of operations. When asked about the impact of the new strategy on its business, Kar said the company will continue to grow three times faster than Nasscom’s projections for fiscal 2015. Nasscom estimates that the industry will grow 13-15 per cent. However, as a private company, Microland did not share revenue plans for the year.

The market for these services is growing, albeit not at a fast clip. The public cloud services market is expected to grow 17.8 per cent in 2014 to $153 billion, according to analysts. Forrester estimates peg the private cloud market at $15.9 billion, up from $7.8 billion in 2011.

Microland has around 2,700 employees, with 75 global clients and six global delivery centres – three in India and one each in the US, the UK and West Asia.


Why Microsoft and IBM Jumped

July 21st, 2014

It was a good week for the S&P 500, which edged 11 points higher as earnings season has been generally positive so far. S&P tech giants Microsoft and IBM were worth a closer look this week.

Job cuts and Intel results driving enthusiasm for Microsoft
Microsoft, the software giant well known for its Windows operating system and Office productivity suites, announced that it will lay off up to 18,000 of its 127,104 employees. Approximately 12,500 of these positions will be from the software giant’s freshly acquired Nokia handset division.

Investors were pleased with this announcement, driving shares up by as much as 3.7% during Thursday’s trading session. Note, however, that Bloomberg reported these rumors on July 15 and Nomura’s Rich Sherlund had issued a note on July 11 claiming that layoffs were imminent, so this headcount reduction may have already been baked to some extent into the stock.

Further, chip giant Intel reported results that showed strength in the business PC space as well as the datacenter. This bodes well for Microsoft, as it is not only exposed to the PC side of things with Windows and Office, but it is also exposed to the datacenter with its multitude of server- and cloud-based products, such as Windows Server, SQL Server, and Azure.

Microsoft is slated to report its third-quarter results after the close on July 22.

IBM still a cash-generating machine
Technology giant IBM reported its earnings results after the close on Thursday. Revenue came in at $24.36 billion, edging out the analyst consensus by $230 million. Earnings per share was $4.32, beating consensus by $0.03. The company also reiterated full-year earnings-per-share guidance of at least $18, above the $17.87 consensus.

While revenue growth for the technology giant has been elusive, with revenues down 2% in the most recent quarter (1% excluding the company’s divested customer-care outsourcing business), the company managed to drive diluted earnings-per-share growth of 42% year over year and net income growth of 28%. The net income growth appears to be driven by lower operating expenses (down 14% year over year) and slightly higher gross profit margins. Earnings-per-share growth outpaced net income growth because the share count dropped 9% from the year-ago period thanks to buybacks.
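As a quick sanity check on those figures: EPS equals net income divided by the diluted share count, so the reported numbers can be reconciled with a couple of lines of arithmetic. This is a back-of-the-envelope check using the rounded percentages above, not IBM’s exact reported figures:

```python
# EPS = net income / diluted share count, so the year-over-year EPS
# multiplier is the net-income multiplier divided by the share-count
# multiplier. Inputs are the rounded percentages reported above.
net_income_growth = 0.28    # net income up 28% year over year
share_count_change = -0.09  # diluted share count down 9% on buybacks

eps_growth = (1 + net_income_growth) / (1 + share_count_change) - 1
print(f"Implied EPS growth: {eps_growth:.1%}")  # prints: Implied EPS growth: 40.7%
```

The implied roughly 41% lines up with the reported 42% once rounding in the inputs is taken into account.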

Though IBM will eventually need to return to revenue growth if it is to drive net income up meaningfully in the longer term (cost-cutting only takes you so far), the stock isn’t exactly priced for growth at just under 11 times this year’s expected earnings. Further, the consistent and aggressive buyback program will help drive earnings-per-share growth even if net income remains flat.

IBM’s shares finished the week up 2.4%.



Telstra’s second shot at cloud

April 7th, 2014

While it might be Australia’s largest telco, Telstra alone cannot compete on cloud computing prices and features when up against the scale of Amazon, Google or Microsoft.

On that basis, Telstra’s new deal with Cisco Systems looks to be prudent.

Under the deal, Cisco Systems will build and operate a cloud computing service inside Telstra’s Australian data centres that is based on Cisco hardware (servers, network) and Red Hat’s supported version of the OpenStack cloud computing framework (albeit with a few minor alterations by Cisco).

These racks will form a node in a global cloud Cisco operates with other telco partners across the globe – what the Californian company brands ‘The Intercloud’.

Just about everything in the stack is built and managed by Cisco – all features, functions, patches – despite it residing in Telstra’s data centre. Telstra concerns itself with network routes in and out of the data centre, customer support and billing.

It has big implications for Telstra’s IT staff, many of whom were laid off in a recent outsourcing drive, and even bigger implications for existing customers of Telstra’s Cloud Services.

Economies of scale

Erez Yarkoni [pictured], former T-Mobile CIO and now Telstra’s director of cloud services, spent time with me late last week to explain his strategy.

Yarkoni arrived at Telstra in November 2013 – his immediate concern being how the telco could achieve scale from its cloud computing investments.

Telstra’s initial foray into cloud computing – initially dubbed ‘Network Computing Services’ and later ‘Telstra Cloud Services’ – was powered by the telco’s investment in hardware from EMC and Cisco, software licenses from VMware and BMC, and an application migration and integration partnership with Accenture.

Several other Australian service providers – notably Alphawest, Bulletproof and Melbourne IT – had also built Infrastructure-as-a-service plays on the back of VMware’s hypervisor.

But at least two of the three have since abandoned these projects as they struggled to compete with the global scale of Amazon Web Services, Google, Microsoft and others. The price war between AWS, Azure and Google in particular has rendered the most aggressive cost models from domestic suppliers uncompetitive, to the point where there is more money to be made reselling and managing services hosted on these platforms than attempting to compete head-on.

Yarkoni told iTnews he landed on the Cisco deal after a global technology tour that encompassed visits with most of the major players including HP, Microsoft and VMware. His discussions encompassed technology roadmaps but also “cloud economics”.

“Infrastructure is a global scale game,” he told iTnews. “And what defines the infrastructure is the software that brings it all together. I could try and compete by hiring 2000 developers in Seattle as Amazon and Microsoft have, or I could embrace open source and do it myself. But to do it yourself – to build your own set of software development resources – you are quickly going to find yourself unable to compete with global infrastructure.”

Yarkoni’s theory is that customers are likely to want to operate in a hybrid state.

“I’m of the same opinion as many Telstra customers – they will want to have more than one cloud infrastructure in play. The world will be a hybrid world. We will concentrate on building the service-managed infrastructure – such that we could, if required, attach Amazon Web Services or any other API-compliant cloud underneath.

“It’s important to understand what cloud economics mean,” Yarkoni said. “IaaS is a lean margin, global service. We need to graduate to building value on top of that infrastructure.”

Cisco’s Intercloud is built on technology the vendor already uses to manage its own WebEx Squared workloads, among others, allowing for a fairly aggressive roadmap.

Telstra already has racks being installed running the Intercloud software – specifically Red Hat’s version of OpenStack with Cisco’s management layer running over the top.

The next job is for Cisco and Telstra to build the necessary layers for the service to be commercially multi-tenant. By late May or early June, Yarkoni hopes to offer a platform in an alpha state, with customers asked to “run some workloads and hammer the alpha environment” for stability while Telstra builds the billing interfaces to sit atop it.

In the latter part of 2014 Telstra will offer a beta service to customers for dev and test use, with the billing service live by then, and full production ready before the close of the year.

Yarkoni agrees that it’s an aggressive timeline, but counts on Cisco and Red Hat to bring the necessary skills to the table to get the job done in a timely fashion.

I asked him if there were enough skills around OpenStack in Australia to do the job.

“That’s a great question,” he said. “Our aim is to build this with Australian hands, provide Australian support but use global R&D. I believe we will find the skills, if not [our partners] will import them.”

What does it mean for existing customers?

While the final deal is yet to be inked, it’s expected Telstra will have Cisco’s Intercloud exclusively in Australia for between one and two years.

Initially, Telstra intends to continue to offer its existing VMware-based ‘Cloud Services’ stack to customers as Intercloud builds up momentum. Yarkoni wasn’t keen to commit to its future development.

“We will respect all existing contracts, and support the features and functions of the last version [CSX],” he said. “And we’ll continue to actively sell it to customers that require dedicated hosting but want to run it on utility infrastructure.

“Over time, what I expect to be a journey of a couple of years, customers will either choose to – or we will help them – migrate to OpenStack with KVM or even with VMware if you have to. We’ll have all sorts of other flavours of cloud underneath.”

Yarkoni agrees that supporting multiple hypervisors or cloud architectures comes at a cost to Telstra. The beauty of OpenStack is that Telstra – and in turn its customers – will be able to choose whether or not to pay VMware’s software licensing fees or use a freely available option. Yarkoni said that if Telstra gets the service management layer right, what sits beneath will be less of an issue.

“If you walk into an Amazon data centre, you will find different versions of what Amazon considered its standard compute over time. But it all still works. Similarly, at some point Apple decided to put Intel chips in its laptops. It didn’t especially matter whether you used a MacBook or a MacBook Pro, or whether your computer came with one chip or another. Apple decided the economies of scale of using Intel chips were better than designing its own.”

For that reason, cloud services built on Cisco’s Intercloud are likely to be branded ‘Telstra Cloud Services’ just as existing services are today.

A new service, and a new customer proposition

Yarkoni hopes that the Intercloud deal will bring about one key architectural difference between existing Telstra cloud services and those that will be on offer by the end of the year.

“Today, the elasticity we call ‘cloud’ exists only up to the edge of the data centre,” he said. “Where it needs to work is all the way to the edgepoint device where a customer consumes a service.

“What we have had to do in the past is to build a service layer that allowed us to address firewalling, routing, VPN, private networks, service instantiation, and all the moves, adds and changes associated with those network issues. What we intend to build now is a service that automates this all the way to the customer.”

Telstra is also looking to incorporate network APIs (application programming interfaces) that allow a customer’s application hosted on the Telstra cloud to adjust the size and quality of the network pipe required.

“It would be ideal to say to customers: for certain apps, you no longer have to plan to a peak network capacity. You can have the app ask for capacity on the go. Or you can schedule a batch job to only run during off-peak rates.”
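As a rough illustration of the idea Yarkoni describes, an application could ask a carrier-side API for capacity when a job starts and hand it back when the job finishes, rather than provisioning for peak. The `NetworkAPI` class and its methods below are entirely hypothetical, invented for this sketch; Telstra had not published such an API:

```python
# Illustrative sketch only: a toy stand-in for a carrier-side capacity
# API that lets an app "ask for capacity on the go" instead of planning
# for peak network load. All names here are invented for illustration.

class NetworkAPI:
    """Toy model of a carrier granting and reclaiming bandwidth."""

    def __init__(self, peak_capacity_mbps: int):
        self.peak_capacity_mbps = peak_capacity_mbps
        self.allocated_mbps = 0

    def request_capacity(self, mbps: int) -> bool:
        """Grant extra bandwidth if headroom remains on the pipe."""
        if self.allocated_mbps + mbps <= self.peak_capacity_mbps:
            self.allocated_mbps += mbps
            return True
        return False

    def release_capacity(self, mbps: int) -> None:
        """Hand capacity back once a batch job completes."""
        self.allocated_mbps = max(0, self.allocated_mbps - mbps)


# A batch job asks for 100 Mbps only while it runs, then releases it,
# so the customer never pays for idle peak capacity.
net = NetworkAPI(peak_capacity_mbps=1000)
assert net.request_capacity(100)  # granted: headroom available
net.release_capacity(100)         # returned after the job completes
```

The design point is simply that capacity becomes an on-demand, per-job request rather than a fixed provisioning decision, which is the "no longer have to plan to a peak network capacity" scenario in the quote above.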

The Intercloud deal will – if customers subscribe at scale – elevate Telstra as a service management and IT operations channel.

“If we have the tools, and the people with a DevOps mindset, we can drive better economies of scale and operational excellence,” Yarkoni said. “We can bring the promise and elasticity of cloud without Telstra or the customer needing to over-invest in capital or planning for peaks.

“Then we can graduate conversations with our customers – how do you deliver that to my applications, the IT that supports my business? Sometimes that is a SaaS conversation, other times it is about bringing elasticity to the application – we pre-configure the infrastructure such that it becomes the click of a button in a catalogue.”


Infoline successfully deploys Microsoft System Centre for OETC

January 7th, 2014

Infoline, Oman’s leading business process outsourcing (BPO) and IT-enabled services (ITES) provider, has successfully implemented System Centre Service Manager, Operations Manager and Configuration Manager 2012 for Oman Electricity Transmission Company (OETC). This cost-effective, flexible and comprehensive management platform will enable OETC to manage its IT environments more easily and efficiently, including its server infrastructure, private and public clouds, and client devices. It will help increase productivity, improve resolution times, improve manageability of the IT asset life cycle, ensure software license compliance, reduce infrastructure costs, keep track of hardware and software inventory, and provide visibility into the IT environment with a daily health report.

It will help OETC automate, integrate and centralise its IT infrastructure. This is the first time Microsoft System Centre Service Manager 2012 has been implemented with a bilingual interface. Infoline’s consultants facilitate every client with a thoughtful perspective and an empowering approach to drive results. By leveraging years of experience and proven methodologies, they bring value to any project with their depth and breadth of knowledge. In addition to optimisation services, Infoline’s consultants develop a gap analysis and technology roadmap that provides plans for migrating to a newly optimised platform, helping clients improve their return on investment (ROI) and reduce total cost of ownership.

“Our IT infrastructure is now managed professionally with Microsoft. It will add value to our environment, productivity and work processes will improve, and all daily work will be documented and easy to monitor and control. We selected Infoline for this implementation as they have a proven track record of successful implementations of Microsoft products and excellent resources for continual support,” said Hilal al Shukri, IT Manager of OETC.


Cloud computing, Microsoft, the Dutch regulator and financial services

July 31st, 2013

The financial services sector throughout the EU wants an answer to the question: when, and to what extent, can we enter the cloud?

A major hold-up, however, continues to be a dispute over the correct interpretation to be given to the provisions of the EU Markets in Financial Instruments Directive (MiFID), which require regulators to be given ‘effective access’ to ‘data’ and ‘premises’ in many instances where data processing is outsourced.

Initially, regulators expressed caution and proposed that effective access to premises equates only to physical access to all data centres upon which cloud processing takes place. Recently though, there has been some suggestion that there may be another way forward.

Earlier this year we reported on an ‘understanding’ or ‘agreement’ reportedly reached between cloud provider Microsoft and the Dutch Central Bank. The agreement had been described as “setting a precedent”, and suggestions were made that it might benefit not only the Dutch regulator but other regulators as well. We were interested to know what this meant in practice and what the agreement looks like on paper. We asked a representative of the Dutch Central Bank for a copy of the agreement between it and Microsoft but were told that one could not be provided to us.

A spokesman for the Bank confirmed, though, that there is no ‘template’ clause agreed between the Bank and Microsoft setting out audit rights in favour of the Bank which may be inserted into Microsoft’s contracts with regulated Dutch financial services firms. But he also confirmed that Microsoft has agreed that the Bank can “visit Microsoft at any moment” in order to check the data belonging to financial services companies under the terms of specific contracts. How that agreement is precisely termed was not shared with us, but the spokesman said that it allows Dutch firms to meet their requirements set out in regulations under the Financial Services Act in the Netherlands – the legislation that transposes the MiFID rules into Dutch national law.

According to Microsoft, this development demonstrates that not all clouds are equal. “There is a misconception that cloud solutions create insurmountable challenges to maintaining regulatory compliance; this misconception is prevalent in the financial services sector and other regulated sectors,” according to Dervish Tayyip, EMEA Legal Director for Microsoft.

“We think the opposite can be true, if customers choose their cloud vendor carefully. The cloud can provide all organisations the economic and technical benefits of cloud services while maintaining compliance with regulations”, he said.

While the agreement is a step forward for Microsoft, suppliers having to negotiate a right for every regulator in every jurisdiction to access all premises upon which cloud providers process data does not seem to us to be the most effective way forward. In our view, more discussion and debate is needed over the correct interpretation to be given to the provisions of MiFID, and of the implementing laws which require ‘effective access’ to premises.

Do on-site audits really give regulators more visibility over the quality of processing activities undertaken by an outsourcing provider? Even if in a perfect world they may, do regulators really have the resources to enable them to effectively inspect cloud resources located outside of their own jurisdiction?

The best way forward may not be at the negotiating table but through a robust legal discussion which backs the idea that effective access to premises may mean digital and not physical access.
We are interested to hear your thoughts.

