Sunday, December 9, 2012

EE ups data allowance on 4G mobile broadband

EE has upped the data allowances for its mobile broadband deals without increasing the price.

EE, which provides the only 4G network in the UK, has increased its 2GB tariff to 3GB, its 3GB tariff to 5GB and its 5GB plan to 8GB, with prices staying at £15.99, £20.99 and £25.99 per month on 18-month contracts – including a free dongle.
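A quick way to see the value of the change is price per gigabyte, which falls on every tier. A back-of-the-envelope sketch using the figures above (illustrative arithmetic only):

```python
# Effective price per gigabyte before and after the allowance increases
# (monthly prices unchanged). Figures taken from the article above.
plans = [
    (15.99, 2, 3),  # (£/month, old allowance in GB, new allowance in GB)
    (20.99, 3, 5),
    (25.99, 5, 8),
]
for price, old_gb, new_gb in plans:
    before = round(price / old_gb, 2)
    after = round(price / new_gb, 2)
    print(f"£{price}/month: £{before}/GB -> £{after}/GB")
```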

It has also introduced a wider choice of contract lengths, from 30 days up to two years. Prices for these new plans start at £12.99, but that tier includes only 1GB of data.

EE also unveiled a new SIM-only deal for 4G-capable smartphones – a monthly rolling contract for £15.99, including 5GB of data. Anyone who signs up to this deal before 23 December will also get the first month free.

4G coverage is still patchy, however, as there are currently only 11 cities in the UK with access to the network – London, Manchester, Bristol, Birmingham, Cardiff, Edinburgh, Leeds, Liverpool, Sheffield, Glasgow and parts of Southampton.

Another five cities are due to go live at the end of the year – Belfast, Derby, Hull, Nottingham, Newcastle – but a source familiar with the situation said there will be a number of extra locations added in the first quarter of 2013, starting with St Albans.


G-Cloud success depends on champions and a change in attitude

Speaking at Business Cloud Summit 2012, Denise McDonagh, director of the government's G-Cloud programme, called for public sector champions to push forward adoption of the G-Cloud.

G-Cloud is in its third year of development. "We have to make it sustainable," McDonagh told delegates.

Systems integration, accreditation and teething problems around how suppliers invoice still need to be sorted out. But the biggest challenge is convincing government departments that the G-Cloud works.

She admitted that to make the public cloud a reality, there needed to be more clarity. In particular, she pointed out that government departments are often hampered by red tape, and work in a traditional “nine-point” procurement plan that needs a 100-page requirements document.

“We need to challenge resistance. G-Cloud is legal, and is very competitive,” said McDonagh. 

Even where bespoke software is used, such as the case management system at the Home Office, the IT infrastructure needed to run it can be cloud-based, she said.

- Make selection easier
- Challenge red tape
- Target C-level and doers
- Clarity on policy

"Storage is storage – I might have bespoke applications, but we are using infrastructure as a service,” added McDonagh.

She said that some people in government did not understand what could be commodity through cloud services, pointing out that even if security is a top priority, there is no reason for 10 departments to get different quotes from the same set of suppliers for a secure desktop. 

“In government, a lot of us do the same things. Why develop your own secure email if there is a secure email cloud service?” said McDonagh. “If it can be commoditised, then it should be in the cloud.”

She said that cloud champions were needed to unlock its potential.

“One of the best things about the cloud is you can buy something for a few months, and if you don't like it you can buy something else, or try before you buy,” she added.

The cloud requires IT integration, an IT function that has traditionally been outsourced to large service integrators, but government departments need to take more control over this, said McDonagh. 

“We need to take accountability for integration. You never truly outsource the risk, so why pay suppliers to take on the risk?” 


Saturday, December 8, 2012

Deloitte and KPMG face litigation over HP/Autonomy deal

Two of the big four accountancy firms have been named in a new lawsuit claiming damages for HP shareholders following the write-down of Autonomy.

HP bought Autonomy in 2011 for $11.7bn, but last week claimed former managers at the UK software firm had used “accounting improprieties, misrepresentations and disclosure failures to inflate the underlying financial metrics of the company” to boost its price before the acquisition went through.

As a result, it wrote down the price of the asset by $8.8bn and referred the accused managers to the US Securities and Exchange Commission (SEC) and the UK Serious Fraud Office.

Mike Lynch, founder and former CEO of Autonomy, has denied all of these claims and in an open letter to HP said he was "shocked and appalled" by the allegations.

However, a stark response from HP said it would not discuss the case with Lynch further until the two parties were in court, where he would have to "answer questions under penalty of perjury".

Autonomy was regularly audited by Deloitte, and HP boss Meg Whitman said her firm used KPMG to check due diligence on these reports.

Philip Ricciardi, an HP investor since 2007, has filed a court case at the US District Court for the Northern District of California in San Jose against all three companies, as well as HP CFO Catherine Lesjak and former HP CEO Leo Apotheker.

The filing accuses the defendants of failing to conduct due diligence, and as a result costing HP shareholders billions when it "grossly" overpaid for Autonomy.

This is not the first court case raised by HP investors.

A class action against HP was filed in the same court earlier this week by investors wanting to seek damages after claiming HP was trading on an inflated stock price between the write-down of services firm EDS this summer and the write-down of Autonomy.

More cases are expected from investors, alongside the legal action HP is taking out against the former management of Autonomy.


AWS launches new EC2 instances and Data Pipeline for big data analytics

Amazon Web Services (AWS) has launched two new EC2 instances for applications and analytics workloads, as well as AWS Data Pipeline, a web service that allows enterprises to move data between various systems.

Launching the new Elastic Compute Cloud (EC2) instance types at re:Invent, its first user conference, AWS chief technology officer Werner Vogels said they will help enterprises build new, high-quality applications quickly.

The two new instances designed for analytics are a cluster high-memory EC2 instance and a high-storage EC2 instance.

“For those enterprises that have been struggling to do very large scale analytics until now, high storage EC2 instance type is for you,” Vogels said.  The new storage instance provides users with 48TB of capacity.

The cluster high-memory instance is aimed at enterprise users building applications that require large-scale memory, he added.

Vogels also launched AWS Data Pipeline, which will help enterprises create automated, scheduled data flows.

Data Pipeline is a data integration cloud service for business intelligence and will help organisations automate big data workflows, Vogels said.

“Data Pipeline is pre-integrated with existing AWS data sources and connects with third-party and on-premise sources,” Vogels said. 

The service was demonstrated by AWS chief data scientist Matt Wood at the re:Invent conference. Wood showed the simple drag-and-drop interface that allows users to create a data pipeline and schedule data-intensive jobs.

According to Kyle Hilgendorf, a principal research analyst at Gartner, the Data Pipeline user interface is “clean and simple”. “I hope the AWS Data Pipeline GUI is a look into the future of the AWS management console,” he said.

The data service can also be used to create daily and weekly analytics reports.

“One common customer request we always get is ‘how do I automate replication of DynamoDB to Amazon S3?’, and Data Pipeline will help enterprises do that,” Wood said.

As there are disparate data collection systems on the cloud such as DynamoDB, Amazon Simple Storage Service (Amazon S3), Amazon Elastic MapReduce (EMR) and now Amazon's new data warehouse service Redshift, it is a challenge to integrate all the data from all these sources, Wood explained. 

“Data Pipeline would help enterprises overcome that big data challenge and consolidate all the disparate data into one place,” he said.
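Data Pipeline definitions are expressed as lists of JSON objects referencing one another. The sketch below is a hypothetical illustration of the DynamoDB-to-S3 copy Wood describes, loosely modelled on that style – the IDs, table name, bucket and field names are all assumptions, not taken from the demo:

```python
# Hypothetical sketch of a pipeline definition for copying a DynamoDB
# table to S3 on a schedule. All names, IDs and fields are illustrative.

def dynamodb_to_s3_pipeline(table_name, bucket, period="1 day"):
    """Build a pipeline-definition object list as plain dicts."""
    return [
        {"id": "DailySchedule", "type": "Schedule",
         "period": period, "startDateTime": "2012-12-01T00:00:00"},
        {"id": "SourceTable", "type": "DynamoDBDataNode",
         "tableName": table_name, "schedule": {"ref": "DailySchedule"}},
        {"id": "Destination", "type": "S3DataNode",
         "directoryPath": "s3://%s/exports/" % bucket,
         "schedule": {"ref": "DailySchedule"}},
        {"id": "ExportActivity", "type": "CopyActivity",
         "input": {"ref": "SourceTable"},
         "output": {"ref": "Destination"},
         "schedule": {"ref": "DailySchedule"}},
    ]

definition = dynamodb_to_s3_pipeline("Orders", "example-bucket")
print(len(definition))  # 4 objects: schedule, source, sink, activity
```

The point of the structure is the one Wood makes: once source, sink and schedule are declared, the replication runs without hand-written glue code.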

In his keynote, Vogels shared his vision of 21st-century applications and IT architecture. “New applications must be resilient, data-driven, adaptable and controllable,” he said.

Elaborating on these four components, Vogels said that 21st-century IT architecture must be “cost-aware” and be built with cost in mind (controllable). He also said that enterprises must constantly inspect the whole application distribution chain and put everything in logs (data-driven).

“There are always code failures. Don’t think failures are exceptions and that’s why you should think of resiliency in the applications you build,” Vogels added.

And lastly, he urged enterprises to make no assumptions and advised them to be adaptable at a time when technology is changing at a fast pace. 

“Don't become attached to your IT infrastructure. Servers won't hug you back,” he said. 

He also spoke about cloud security. “When Amazon.com decided to move all its services to AWS cloud, we decided to encrypt everything – the data that is in transit as well as the data that is at rest,” he said.

Enterprises should think about integrating security into their apps from the ground up, according to Vogels.

“In the old world, everything was resource-focused. In the new world, everything is business-focused and enterprises must think of IT architecture and applications from a business point of view,” he said.


Friday, December 7, 2012

AWS re:Invent: Amazon launches data warehouse service Redshift and cuts S3 prices

Amazon Web Services (AWS) is making a foray into the data warehousing segment with the launch of Redshift, a cloud-based data warehouse service. 

It announced the Redshift service, along with Amazon S3 price cuts, at its first ever user and partner conference, AWS re:Invent, in Las Vegas.

“Traditional data warehouse products are too expensive and have licensing complications,” said AWS senior vice-president Andy Jassy. “Many large enterprises told us they are unhappy with the existing data warehousing services in the market.”

Launching in 2013, AWS Redshift will be a fully managed, petabyte-scale data warehouse service in the cloud. It will help enterprise IT automate labour-intensive tasks such as setting up, operating and scaling a data warehouse cluster.

It will also help them to provision capacity, monitor and back up the cluster, as well as apply patches and upgrades.

Amazon Redshift is designed for developers or businesses that require the full features and capabilities of a relational data warehouse, said Jassy.

It is certified by Jaspersoft and MicroStrategy, and will work with most business intelligence (BI) tools, including SAP Business Objects and Cognos among others, with additional BI tools to be added later.

“When customers complained to us about existing products, we wanted to do data warehousing services on the cloud AWS-style,” said Jassy. “It should be easy to provision, easy to scale, open and flexible, and inexpensive.”

Today, an average enterprise typically pays between $19,000 and $25,000 per terabyte (TB) per year on data warehousing. With Redshift, the average cost per terabyte per year would be around $1,000.
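Taken at face value, those figures imply roughly a 19- to 25-fold reduction in cost per terabyte per year (illustrative arithmetic only):

```python
# The cost gap implied by the figures quoted above.
traditional_low, traditional_high = 19000, 25000  # $ per TB per year
redshift = 1000                                   # $ per TB per year (approx.)

print(traditional_low // redshift, traditional_high // redshift)  # 19 25
```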

In addition to cost savings, Redshift will offer 10 times faster performance and will be a “game changer” in the data warehouse space, said Jassy.

Amazon’s cloud-based data warehouse service marks its venture into a new segment. It already provides infrastructure as a service (IaaS); cloud-based storage, through Amazon Simple Storage Service (S3); and cloud-based database services, through Amazon RDS.

Redshift is due to launch next year, but it is already used by the retailer itself and other enterprise customers, including Netflix, Nasa and Flipboard, under AWS’s private beta programme.

"At Netflix, we deliver personalised recommendations for millions of subscribers by analysing large volumes of data, and are always looking for ways to improve our service," said Kurt Brown, director, data science and engineering platform, at Netflix. "The cost-disruptive and cloud-based model of Amazon Redshift will shake up the data warehousing industry,” he added.

Another big highlight of the AWS re:Invent conference was the price reduction of Amazon S3. “We are lowering Amazon S3 pricing by about 25% across the board, effective from 1 December 2012,” said Jassy.

Amazon S3 is AWS’s web services interface that enterprises use to store and retrieve any amount of data, at any time, from anywhere on the web.

Under the new pricing, customers will be able to store data in the cloud for about nine cents per gigabyte. S3 currently stores 1.3 trillion objects and receives 800,000 requests per second, according to AWS.
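Backing out the approximate pre-cut price from those figures: a cut of "about 25%" landing near nine cents per gigabyte implies the old first-tier price was around 12 cents. This is illustrative arithmetic, not official AWS pricing:

```python
# Reversing the announced ~25% cut to estimate the pre-cut price.
new_price = 0.09   # $ per GB per month after the cut (approx.)
cut = 0.25         # announced reduction, "about 25%"
old_price = new_price / (1 - cut)
print(round(old_price, 2))  # 0.12
```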

Jassy said the company is passing its lower overall cost of storage and better economies of scale on to customers, and has "made 23 price reductions since 2006". 

AWS is being aggressive in its push into the enterprise segment, venturing into new markets and making steep price cuts, Michael Barnes, research director at Forrester, told Computer Weekly.

“It is carrying forward its focus on low-margin, high-volume strategy,” he added.

AWS re:Invent, Amazon’s first ever user conference, has been all about capturing more of the enterprise market share, said Barnes. 

Among other highlights, AWS reiterated its commitment to cloud security at the user conference and spoke about its new services, including Amazon CloudSearch and Amazon Glacier, its low-cost cloud storage service for data that can tolerate a three- to five-hour retrieval time.


Nokia wins patent case against RIM

Nokia has won its dispute with Research In Motion (RIM) after a year of fighting over its patent-licensing agreement.

Nokia and RIM have a history of licensing patents to one another, and agreed a cross-license in 2003 for standards-essential cellular patents.

However, in 2011, RIM claimed the licence had “extended beyond cellular essentials” and took the case to arbitration to renegotiate its deal with Nokia.

The move backfired, though: the arbitrator ruled RIM was in breach of contract for manufacturing and selling WLAN (wireless local area network) products without first agreeing royalties with Nokia.

Things are now set to get worse for the BlackBerry maker: a Nokia spokesman told us the company has filed lawsuits in the UK, US and Canada “to enforce the tribunal’s ruling” and sue for breach of contract.

Computer Weekly contacted RIM to ask if it was planning to appeal or had any comment on the case, but a spokeswoman from the firm declined to comment.

The ruling comes in the same week Ericsson has decided to head to the courts over its licensing agreement with Samsung.

Ericsson and Samsung have been negotiating for over two years on a patent deal that would see Samsung pay licence fees to Ericsson. However, when the Swedish firm raised its prices, Samsung refused to pay, calling them “prohibitively higher”.

Now, Ericsson is taking legal action and the decision will be made in a US court.  


Thursday, December 6, 2012

Mobile channels: The right and wrong way to reach busy customers

With as many as 110 million Americans using smartphones, businesses are increasingly throwing money at mobile applications to better serve their customers.

And with mobile channels offering a whole new way to connect customers and contact centers, it's no wonder, said Brad Cleveland, a consultant and former president of the International Customer Management Institute. Mobile, he said, is driving "a higher level of call" in the contact center.

Armed with mobile technology, customers can reach out to a contact center any time of the day, from anywhere. They don't have the patience to wait for an answer, Cleveland said. In today's instant-gratification society, customers reach out "in between meetings, just before a flight, standing in line somewhere … that’s hugely important to callers. Expectations have gone up significantly."

Mobile channels open new windows

And so has the pressure to keep customers happy. One way for companies to meet such high expectations is to take up mobile use themselves, said Paul Greenberg, president of The 56 Group. Take tablets, for example. The devices have given the mobile channel a starring role in CRM strategies. They allow customer service agents and field technicians to stay connected with each other. That’s important for companies that rely on stellar reputations with in-person customer service.

"With any sort of connectivity, [technicians] can get access to field manuals, best practices, communicate with headquarters and do the billing," he said. "They can help the customer where the customer is at."

Social media applications have also pushed mobile CRM to a higher standard, Greenberg said. If someone sends out word of a bad experience with a company through Twitter, that message can be seen by millions of people around the world who could make judgments about the company.

At the same time, social media applications allow companies to respond quickly and briefly to a customer complaint.

"You have the option to interact that way, and that’s useful," Greenberg said. "If I’m sitting somewhere and don’t really want to talk on the phone, I can still do this. It gives me the ability to communicate my issue and remain silent at the same time. That’s important."

Trouble in mobile paradise

But the prevalent use of mobile hasn't led to an equal boost in customer service – at least not yet, according to several analysts who focus on contact centers.

Donna Fluss, founder and president of DMG Consulting LLC, said mobile applications often serve basic purposes, but many of them have not been fully connected to help customer service agents do their jobs.

"Mobile apps have the potential for automating more calls, but when a call needs to get to an agent, it becomes a little more complicated," Fluss said. "Some will allow the call to transfer to an agent with the indicative information, but other apps don’t have that capability. Customers have to repeat themselves, and that's not a very good experience."
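The "indicative information" Fluss mentions is, in effect, a context payload handed over with the transfer so the agent can pick up where the app left off. A toy sketch of the idea – the queue, field names and values here are all hypothetical, not drawn from any specific product:

```python
# Toy illustration of escalating a mobile-app session to an agent with
# context attached, so the customer need not repeat themselves.
# The queue, field names and values are hypothetical.
from collections import deque

agent_queue = deque()

def escalate_to_agent(session):
    """Hand the app session to the agent queue with its context intact."""
    payload = {
        "customer_id": session["customer_id"],
        "last_screen": session["last_screen"],  # where the app left off
        "transaction_ref": session.get("transaction_ref"),
    }
    agent_queue.append(payload)
    return payload

ticket = escalate_to_agent({
    "customer_id": "C-1001",
    "last_screen": "payment-confirmation",
    "transaction_ref": "TX-778",
})
print(ticket["last_screen"])  # payment-confirmation
```

Apps that lack this handover are the ones Fluss criticises: the call arrives with an empty payload and the customer starts over.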

Elizabeth Herrell, an analyst at Constellation Research Inc., said that incomplete mobile applications pose a problem.

If customers aren't sure their transactions were processed by a mobile application and want to talk to an agent, some apps offer only a list of phone numbers to call, she said. That puts the customer back at square one, which is what companies are trying to avoid.

Not integrated? Not so fast

The trick is integration. Mobile applications must be fully integrated with a contact center to have a substantial effect on customer experiences, Herrell said. Apps that integrate with contact center technologies and mobile support allow agents to respond to questions intelligently and contextually – that is, without requiring the customer to tell his story again and again. Companies that achieve integration improve customer experiences and also make busy agents' lives easier.

But those applications are newer, and no more than 10% of companies have adopted them, Herrell said. "Companies with aging infrastructures, they’re not fully up to speed on all these channels."

In falling behind, companies not only miss an important opportunity to build customer relationships, they fail to close the growing gap between company capabilities and customer expectations.

But there's still time: The contact center environment is changing rapidly. "We’re really at the beginning of this kind of thing," Fluss said.


This was first published in November 2012

Syria cut off from internet

The internet in Syria has been taken down by an as-yet-unidentified source, leaving 20 million citizens offline.

Internet intelligence firm Renesys was the first to report the issues after it discovered 77 network outages at 10:26am UK time, which accounted for 92% of the networks routed through the country.

Renesys, which runs a real-time global sensor grid to monitor and analyse internet routing data, later reported the whole internet was down as none of 84 blocks of IP addresses allocated to Syria were contactable.

“In the global routing table, all 84 of Syria's IP address blocks have become unreachable, effectively removing the country from the internet,” wrote Jim Cowie, co-founder and CTO of Renesys, on the company’s blog.

“We are investigating the dynamics of the outage and will post updates as they become available.”
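The two Renesys figures quoted above are consistent with each other: 77 unreachable networks out of 84 allocated blocks is roughly the 92% first reported (illustrative arithmetic):

```python
# Sanity-checking the Renesys figures: 77 of 84 allocated blocks
# unreachable at first report, before the outage became total.
outages = 77
allocated_blocks = 84
share = outages / allocated_blocks
print(round(share * 100))  # 92
```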

Computer Weekly tried to contact the Syrian Embassy to find out the reasons for the outage but was unable to find a spokesman.

Emirates airline also confirmed to Computer Weekly that all its flights out of the country have been suspended with immediate effect due to “political unrest”, although no link between the two events has been confirmed.

The UK closed its embassy in Syria last year over security concerns and still advises UK citizens not to travel to the country under any circumstances.

The Foreign Office was also unable to comment on the current situation.


Carlson Wagonlit Travel hires new CIO

Global travel firm Carlson Wagonlit Travel (CWT) has appointed Kevin O'Connor as its executive vice president and chief information officer.

O'Connor, formerly CIO at online gambling firm Paddy Power and, more recently, at the technology arm of NYSE Euronext, has the task of delivering innovation through IT to his new employer.

In his new role, O'Connor is a member of the CWT executive team and reports to Douglas Anderson, the company's president and chief executive. In a move coinciding with O'Connor's appointment, the company has also consolidated its IT and product delivery teams under the CIO role.

According to Carlson Wagonlit, the company is also adapting to the ways in which travellers use technology within business travel and is finding that IT is a crucial component of driving innovation.

“The industry is going through a big evolutionary step with IT not only playing a key role in integrating and automating the many global systems and processes more effectively, but where the innovative and carefully targeted application of technology can and will create clear differentiation between the major players’ offerings to their clients," O'Connor said.

"It is a great opportunity to be part of this market-leading company as it applies the best technologies and practices."

Prior to joining Paddy Power, O'Connor held a number of senior IT roles at financial services firms such as MSCI Barra, Morgan Stanley and Deutsche Bank.


Wednesday, December 5, 2012

Cybergeddon unlikely, say experts

Cybergeddon, a complete take-down of the internet, is undoubtedly within the power of some nation states, but is unlikely, say security industry experts.

Bringing down the internet would not be in anyone’s interest, said Fred Piper, security consultant and former director of the information security group at Royal Holloway, University of London.

While we have seen cyber assaults of this sort on Estonia, nation states and even cyber criminals are unlikely to bring down the internet as they rely on it, Piper told a roundtable discussion in London.

However, critical national infrastructure is extremely vulnerable to state-sponsored cyber attack and considered to be a top risk by the US government, said Hugh Thompson, chief security strategist for security firm Blue Coat Systems.

The growing number of attacks against embedded systems demonstrated by security researchers, and evidence of such attacks in the past 24 months, should be a strong wake-up call, he said.

“Attackers may be using simple social engineering techniques to get inside networks, but the methods they are using to move laterally once inside are extremely sophisticated,” said Thompson.

As with nuclear weapons in the past, the world is starting to engage in a cyber arms race, in which everyone wants powerful cyber weapons they hope they will never use, said Paul Simmonds, co-founder of the Jericho Forum and former CISO of AstraZeneca and ICI.

“The cyber capability of nation states that has been publicly revealed in the past year, and even in critical national infrastructure discussions, is probably only the tip of the iceberg, with most of the true capability remaining hidden,” Paul Simmonds said.

Nation states are unlikely to use zero-day threats they may have developed, said Thompson, because once they have been used, they will no longer be effective.

“Once an attack method is known, it is more difficult to repeat, because people adapt to what has happened,” he said.

For this reason, said Simmonds, it is worrying that many countries are rushing headlong into connecting elements of critical infrastructure to the internet to cut costs and for ease of use.

The UK is publicly investing a lot into assessing its cyber defence capability and identifying the gaps, so it is not all gloomy and awareness is increasing, which is good, said Piper.

However, he said it is unknown if this is moving fast enough and it also needs to be collaborative at an international level if it is to be effective.

“Hopefully, in 2013 we will see more momentum in international collaboration around issues such as bringing attackers to justice no matter where they are operating from,” said Piper.

This is particularly important in light of the fact that the ability to attack critical infrastructure is not necessarily limited to state-sponsored attackers, said Thompson.

With much lower barriers to entry through widely available attack toolkits, individual attackers can be as powerful as nation states and cause an enormous amount of damage, he said.

However, according to Simmonds, any major halt to critical infrastructure or the internet is more likely to be the unintended result of other attacks or failures than a deliberate strike.

Cyber weapons are also unlikely to be used in isolation; they are far more likely to be part of a much wider military campaign, said Thompson.

Looking ahead to 2013, he said organisations should be more concerned about targeted attacks and ensure they are prepared to detect and mitigate these.

“Targeted attacks using freshly compiled code are reaching epidemic proportions,” said Thompson, “but because few organisations are reporting such attacks, they are not being taken as seriously as they should be by many businesses.”


RBS launches digital wallet trial

RBS is the first bank in the UK to adopt Visa Europe’s digital wallet, with an initial roll-out of technology the bank believes will make it easier for customers to buy securely while on the move.

Visa Europe’s V.me digital wallet was made available in the UK in May. 

It provides Visa and non-Visa customers with a secure online point of sale (POS), which can be accessed via PC, tablet and mobile devices as part of a digital wallet service, and will eventually be incorporated into all Visa Europe’s new payment technologies.

RBS is initially running a trial, ahead of a full roll-out next year that will make the service available to all RBS and NatWest customers.

Steve Rubenstein, director of everyday banking at RBS, said the proposition reduces the effort of buying online while also improving security for customers.

The number of customers using smartphones to make purchases is soaring, according to research for the IMRG Capgemini e-Retail Sales index. It found that mobile retail averaged 300% year-on-year growth in the first quarter of 2012.



ICO fines text spammers Tetrus Telecoms £440,000

The Information Commissioner’s Office (ICO) has fined a marketing firm £440,000 for sending spam text messages.

Tetrus Telecoms, owned by Christopher Niebel and Gary McNeish, has been under investigation for 18 months after the ICO received reports the firm had been sending large volumes of unsolicited text messages from its two offices in Stockport and Birmingham.

The texts did not identify the sender or gain the recipient’s permission, meaning Tetrus Telecoms broke the Privacy and Electronic Communications Regulations (PECR), which came into force in 2012.

Tetrus Telecoms was sending as many as 840,000 texts a day from unregistered, pay-as-you-go SIM cards. The firm then sold on the phone numbers of anyone who replied for between £3 and £5 each as a lead-generation scheme, making between £7,000 and £8,000 per day.

Christopher Niebel was fined £300,000 and Gary McNeish £140,000; the ICO said McNeish appeared to have taken less money out of the business.

“The two individuals we have served penalties on today made a substantial profit from the sale of personal information,” said Information Commissioner Christopher Graham.


“They knew they were breaking the law and the trail of evidence uncovered by my office highlights the scale of their operations.”

Niebel and McNeish face prosecution for failing to notify the ICO that Tetrus Telecoms was processing personal information, a requirement of the Data Protection Act (DPA). This could lead to a fine of £5,000 in a magistrates' court or, if the case were referred up to the Crown Court, an unlimited penalty.

The ICO has warned companies which bought data from Tetrus Telecoms to check whether customer consent was given or risk coming under scrutiny as well.

“We will continue to work with the relevant authorities, as well as the network providers, to ensure companies like this are punished,” said Christopher Graham.

“We’re also working with the Ministry of Justice to target claims management companies who purchase this information, breaching the industry regulations, the Data Protection Act, as well as electronic marketing regulations.”

The ICO has set up an online survey to enable customers getting spam texts or phone calls to report them directly.

“Our message to the public is that if you don’t know who sent you a text message then do not respond, otherwise your details may be used to generate profits for these unscrupulous individuals,” Graham concluded. 

“Together we can put an end to this unlawful industry that continues to plague our daily lives.” 



Tuesday, December 4, 2012

Advanced technologies seen as aid to contact center agents

Contact center agents had it rough when an 800 number was a customer's only option. Now, with widespread use of social media and live chats, it can be even more difficult for an agent to manage problems and soothe caller frustration.

It no longer matters which of the many channels the customer chooses: Companies have to give their agents the tools to fix customer problems in every situation, industry watchers say.

Agents must have advanced contact center technologies, like desktop tools, that make information available for quick and easy use. They can't afford to waste a caller’s time -- especially an irritated one -- toggling between several channels to learn his personal information, his problems and how to solve them.

"It is about giving the agent the information he or she needs to provide personalized customer service and create a positive experience," said Pekka Porkka, general manager of SAP’s contact center software business. "It leaves the customer with the feeling that the company knows him or her and that they care."

In some cases, companies even have designated teams of up to a dozen employees to watch and respond to social media, said Paul Greenberg, president of The 56 Group, a Washington, D.C., consulting firm focused on CRM strategies.

"We’re seeing the transformation of agents," he said. "We’re seeing the addition of channels that 10 years ago weren’t there."

The customer is always … uptight?

Strong knowledge-management systems that draw on company best practices, rather than a general handbook, can help contact center agents predict or respond to customer questions, Greenberg said. With these tools, agents can use the company's own historical data and results to address customer issues.

"Not only can they bring up knowledge from knowledge bases, but context-specific knowledge," he said. "That’s really critical. Agents can find the best certified solutions to problems based not on a manual but the historical experiences of the company’s own customers and agents."
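The "context-specific knowledge" Greenberg describes can be reduced to a simple idea: score knowledge-base articles against the tags of the case in hand, breaking ties by how often an article has actually resolved a case. The sketch below is a toy illustration of that ranking; the names (`Article`, `rank_articles`) and the sample data are invented, not any vendor's API.

```python
# Toy sketch of context-aware knowledge retrieval: rank knowledge-base
# articles by tag overlap with the current case, breaking ties by how
# often an article has previously resolved a case.
from dataclasses import dataclass, field

@dataclass
class Article:
    title: str
    tags: set = field(default_factory=set)
    times_resolved: int = 0  # how often this article has closed a case

def rank_articles(articles, case_tags):
    """Best match first: most shared tags, then strongest track record."""
    return sorted(articles,
                  key=lambda a: (len(a.tags & case_tags), a.times_resolved),
                  reverse=True)

kb = [
    Article("Reset router firmware", {"router", "firmware"}, 40),
    Article("Billing dispute workflow", {"billing", "refund"}, 120),
    Article("Router drops Wi-Fi after update", {"router", "firmware", "wifi"}, 75),
]

print(rank_articles(kb, {"router", "wifi"})[0].title)
# → Router drops Wi-Fi after update
```

The point of the tie-break is Greenberg's: solutions are ranked by the company's own historical outcomes, not by a manual's ordering.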

Voice analysis software helps agents determine callers' moods, alerting contact center managers to cranky customers who are then pushed to the head of the line, Greenberg said.

Real-time speech analytics can also gauge customers' feelings. If a caller is particularly upset, the software alerts a supervisor, who can then respond quickly to take the call and avoid losing a customer.

"There’s a big difference if I’m irritated or furious," Greenberg said. "When you're dealing with thousands of calls and a troubleshooting team, your job is to prioritize the calls to take. There's a lot involved in that."
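The prioritisation Greenberg describes can be sketched as a sentiment-keyed priority queue: a hypothetical speech-analytics score accompanies each call, the most negative score sits at the head of the queue, and anything below an "anger" threshold also raises a supervisor alert. The scale and threshold are assumptions for illustration only.

```python
# Minimal sketch of mood-based call prioritisation. Sentiment is assumed
# to arrive from speech analytics on a [-1, 1] scale; heapq is a min-heap,
# so the most negative (angriest) caller sits at the head of the queue.
import heapq

ANGER_THRESHOLD = -0.5  # below this, a supervisor is alerted immediately
alerts = []             # supervisor alerts raised so far

def alert_supervisor(call_id, sentiment):
    alerts.append(call_id)

def enqueue(queue, call_id, sentiment):
    heapq.heappush(queue, (sentiment, call_id))
    if sentiment < ANGER_THRESHOLD:
        alert_supervisor(call_id, sentiment)

queue = []
enqueue(queue, "call-17", 0.2)   # mildly positive
enqueue(queue, "call-42", -0.8)  # furious: alerted and prioritised
enqueue(queue, "call-99", -0.3)  # irritated, but under the threshold

print(queue[0][1])  # → call-42
```

This captures the irritated-versus-furious distinction: both sort ahead of happy callers, but only the furious one interrupts a supervisor.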

Customers calling in are typically more frustrated than they have been in the past, said Brad Cleveland, a CRM consultant in Sun Valley, Idaho. They no longer want to spend time on the phone for answers to simple problems; they don't want to go through an automated call service to get an answer they could have found online with a quick Web search, he said.

"There are customer communities and self-service technologies to get what [customers] need," said Cleveland, a former president of the International Customer Management Institute. "By the time it reaches an agent, it's a pretty high-level call at that point."

Contact center agents are people too

For agents who have to deal with all those touchy customers, a number of advanced contact center technologies can make all the difference.

Donna Fluss, founder and president of DMG Consulting in West Orange, N.J., identified four popular tools that contact center agents believe make their jobs easier. They are real-time speech analytics software, real-time guidance applications, next-best-action technologies and predictive analytics. Each tool can help agents respond to queries in real time and anticipate customer behavior.

But companies won't have success with those tools until they see things from the agents' perspective and understand what their jobs entail, and then take action to make their lives easier, she said.

"Good technology in the wrong hands isn't worth being in anybody's hands," Fluss said. "It's about bringing a lot of good information together for agents to give them more directed, useful information so that they can make the most out of every conversation they're having."

Elizabeth Herrell, an analyst at Constellation Research Inc. in Monte Vista, Calif., said there are many desktop software tools on the market, from Salesforce.com and Pegasystems, for example, that do just that. Companies can then incorporate these tools into their CRM strategies and improve customer relationships by cutting down call times and satisfying more customers.

"It pushes information to the agent, instead of the agent going out there to pull it forward," Herrell said of the desktop technology. "If you have a really integrated desktop with good CRM systems, you make it a lot easier for the agent to speed up their job."

The need to get analytical

Companies that combine desktop tools with predictive analytics "can get some exciting things going on," Fluss said. With analytics, companies can find out which channels customers prefer, predict the number of contacts in those channels and better prepare their agents to handle more complex customer questions.

But those industry leaders are still rare, Herrell said.

"The shortfall of companies right now is in analytics," she said. "There are a lot of initiatives in progress, but a lot of them are separate and not accurately supported."

David Lowy, Moxie Software’s vice president of product management, said many companies also have trouble balancing productivity and quality in the contact center.

Moxie products try to address that issue with "proactive service," Lowy said, which draws from a rules-based engine that understands what a customer has been looking for and provides customer profile information. Then an agent can step in and offer live chat assistance while the customer is perusing the company’s website.
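A rules-based engine of the kind Lowy describes can be illustrated in a few lines of Python. The rule shapes and visitor profile below are invented to show the pattern of predicates driving a chat offer; they are not Moxie's actual product logic.

```python
# Sketch of a "proactive service" rule set: each rule is a named predicate
# over a visitor profile; the first match triggers a live-chat offer.
def should_offer_chat(profile, rules):
    """Return the name of the first matching rule, or None."""
    for name, predicate in rules:
        if predicate(profile):
            return name
    return None

rules = [
    ("stuck_on_checkout",
     lambda p: p["page"] == "/checkout" and p["seconds_on_page"] > 120),
    ("repeat_pricing_visits",
     lambda p: p["visits"].count("/pricing") >= 3),
]

visitor = {
    "page": "/pricing", "seconds_on_page": 45,
    "visits": ["/home", "/pricing", "/features", "/pricing", "/pricing"],
}

print(should_offer_chat(visitor, rules))  # → repeat_pricing_visits
```

Once a rule fires, an agent (or an automated greeting) can step in with context already attached to the session.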

In the age of "big data", when companies are collecting great swathes of structured and unstructured information, those predictive and responsive capabilities to improve productivity and quality are even more important. Microsoft, for instance, announced a partnership with Moxie in November 2012 to take advantage of its products in Microsoft's own CRM.

"We’re talking about huge amounts of knowledge that have to be analyzed and sorted and organized and presented and consumed," Greenberg said. "These are tools to do all that, and they do that in close to, if not actually, real time."


This was first published in November 2012

Forrester: Microsoft licence hike makes no sense

Windows users can expect a 15% increase in the cost of licensing key Microsoft products as the software giant raises its prices.

From December 1 2012, Microsoft will increase the price of per-user licensing of several products including Exchange, its Lync communications server, Windows Server and Terminal Services.

The change in price reflects a change in usage, as more users access Windows software from several devices rather than a single PC. But analyst firm Forrester believes the licence fee increase is flawed and will erode the value of Microsoft's products against the likes of Google.

Forrester's Duncan Jones said Microsoft was struggling to decide how to shift from device Client Access Licences (CALs) to per-user CALs. 

“A lot of companies are on device licensing. Some have a mixture. If everyone has a PC then per user makes sense, but if you are a company like a manufacturer where not all users have a PC, then per device is cheaper,” said Jones.

But per-device licensing for software is obsolete in the mobile and virtual world according to Jones. 

"It is impossible to control and not relevant to the app internet world. Gradually software companies will change licensing accordingly,” he said.

However Jones believes Microsoft’s approach to changing its client access licensing is flawed. 

He said: “I told Microsoft it was a mistake to charge more for user CALs than device CALs. It’s inside-out thinking. 

"There is no evidence that people are deriving more value from MS software and people will not be willing to pay 15% more because they can access it from a tablet."

While Microsoft has increased the pricing of user CALs to address device proliferation, it has not added anything extra to justify the price increase. Jones said Microsoft should have cut the cost of device CALs rather than increasing user CALs.

“User CALs are not more valuable than device CALs. Device CALs now have less value,” said Jones.
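Jones's point about the two licensing models is easy to check with back-of-envelope arithmetic. The prices below are hypothetical; what matters is the crossover: per-user CALs only pay off once users average more devices than the ratio of the two prices allows for.

```python
# Back-of-envelope comparison of per-user vs per-device Client Access
# Licences. The prices are made up; the crossover depends only on the
# ratio of devices to users versus the ratio of the two CAL prices.
def cheaper_model(users, devices, user_cal_price, device_cal_price):
    per_user = users * user_cal_price
    per_device = devices * device_cal_price
    return ("user", per_user) if per_user < per_device else ("device", per_device)

# Office workforce: 100 users, each with a PC, tablet and phone
print(cheaper_model(100, 300, 34.0, 30.0))   # → ('user', 3400.0)

# Manufacturer: 300 shift workers sharing 60 terminals
print(cheaper_model(300, 60, 34.0, 30.0))    # → ('device', 1800.0)
```

The second case is exactly Jones's manufacturer example: with shared terminals, per-device licensing stays far cheaper, so raising the per-user price does nothing for those customers.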

He believes the change in pricing will affect how IT departments upgrade. If they have an Enterprise Agreement (EA), their organisation buys the right to upgrade to the newest product release, but businesses may consider they are happy with the current release. 

“Many will likely stay put on current versions,” he added, which will impact EA revenue.

In a research paper on Microsoft's licensing price increase, analyst Gartner criticised Microsoft for not informing its customers directly. The supplier has relied on software resellers to explain to customers what the price increase will mean. 

In the report, Gartner research director Frances O'Brien noted: “Gartner agrees that some, but not all Microsoft customers may have received additional value from user-based CALs. 

"But we believe that Microsoft’s silence and lack of advance notice to its customers may prove problematic for some organisations that rush to renew early, simply to avoid a price increase. 

"These price hikes — combined with licensing changes and the inherent cost implications for SQL 2012, System Center 2012, Windows Server 2012, and the price changes alluded to in Microsoft's 1 October 2012 Product Use Rights Document for SharePoint Server 2013 and Lync Server 2013 — could add unexpected expenses to some customers' budgets.”

Phil Heap, head of products and services at Fast, the software asset management specialist, said: “This adds another layer of complexity to Microsoft licensing and it’s obviously being driven by the need for flexible and mobile working scenarios.”



Monday, December 3, 2012

Fruit and veg distributor keeps food fresh with Infor’s M3 ERP

Greengrocer Reynolds is enjoying the fruits of an Infor ERP system that cuts down on food waste.

It is 67 years since William Reynolds started selling fruit and vegetables from a wheelbarrow at the Ridley Road Market in Hackney, in the blitzed-out East End of London. Today, his grandson, Tony Reynolds, the managing director, thinks that the IT Reynolds has applied to make its operations more efficient gives it an edge in the food distribution sector.

On 4 October 2010, Reynolds went live with Infor’s M3 enterprise resource planning system. 

“We had four hours' downtime to change our whole business," said IT and business systems director Richard Calder. "It was character building."

Since then, his four-person IT team has joined with a four-person ‘business improvement team’ in fine-tuning the M3 installation. 

“It’s like having a new engine in your car," Richard Calder said.

The company “pretty much started from scratch” when putting in a new ERP system, said Tony Reynolds, having outgrown a legacy system – bespoke for the food produce distribution sector – that they’d been using since the 1990s.

Reynolds confirmed that the legacy ERP and supply chain systems were exhausted. The management teams were unable to get the information they needed quickly enough, or in enough detail, to make confident decisions. But putting in a new core system was still a bold move.

“Competitors and distributors have massive horror stories to tell about failed ERP implementations," said Reynolds. "I felt if we could get through the M3 implementation, we would have a single platform, whereas our competitors are working with bolted-on systems."

At the time, the company had a £150m turnover, "but to get to £200m plus, we would have to have something better in place," said Reynolds.

The family-owned firm has a 140,000ft² temperature-controlled national distribution facility at Waltham Cross, near London, and regional depots in Manchester, Bristol, the Midlands and Scotland. It processes more than 3,000 orders daily, operates a fleet of 194 vehicles and stocks more than 2,500 different products.

Reynolds supplies ASK, Carluccio's, Pizza Express, Prêt-à-Manger and Prezzo, as well as hotels such as Accor and Ramada Jarvis, and the Fuller's pub chain, with fresh fruit, salad and vegetables, cheese and other dairy products.

The Gondola Group, which owns Pizza Express and ASK, is known for its innovation with IT, using analytics to target diners, and Calder confirmed they were “very supportive” during the change Reynolds made in 2010.

They considered Microsoft Dynamics AX, IFS and JD Edwards, among others, but M3 was a “clear choice because of the functionality that we required and the fact that it was supplier-managed. 

"Resellers were not so attractive. And, at the time we were working with Lawson consultants closely," said Calder.

Infor acquired Lawson in 2011, and Calder expressed approval. 

“Infor has clearly invested heavily in M3 and the ION [Intelligent Open Network middleware] strategy opens up a huge array of improvement opportunities," he said. "For us, in the future, ION will become a key part of the business case justification because, if we want new functionality, we won’t have to pay for modification or integration work to be done.”

Calder said M3 had some specific features that made it attractive to the greengrocer. 

“The re-inspection time and sales time features enable Reynolds to reclassify and adjust shelf life at both item and lot level - delivering a reduction in the risk of product waste.

“The main thing is that we need to be able to vary the expiry date. We need the flexibility to reduce waste. We don’t have the luxury of, say, 7% waste. My gut feeling is that we have achieved a 0.5% reduction in waste. That is a significant figure for us."
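The lot-level expiry control Calder describes can be modelled simply: an item carries a default shelf life, and a re-inspection records an adjustment against one lot without touching the item default. The field names below are invented for illustration and are not M3's data model.

```python
# Illustrative model of lot-level shelf-life adjustment: each item has a
# default shelf life; a re-inspection records a per-lot adjustment that
# moves that lot's expiry date without changing the item default.
from datetime import date, timedelta

ITEM_DEFAULT_SHELF_DAYS = {"avocado": 5}

lots = {("avocado", "LOT-0172"): {"received": date(2012, 10, 1), "adjust_days": 0}}

def expiry(item, lot_id):
    lot = lots[(item, lot_id)]
    return lot["received"] + timedelta(days=ITEM_DEFAULT_SHELF_DAYS[item] + lot["adjust_days"])

def reinspect(item, lot_id, adjust_days):
    """Record an inspector's shelf-life change for one lot only."""
    lots[(item, lot_id)]["adjust_days"] = adjust_days

print(expiry("avocado", "LOT-0172"))  # → 2012-10-06 (the item default)
reinspect("avocado", "LOT-0172", 2)   # firmer than expected: extend
print(expiry("avocado", "LOT-0172"))  # → 2012-10-08
```

A negative adjustment shortens the date the same way, which is where the waste reduction comes from: stock that is better than expected stays sellable for longer, and stock that is worse is moved first.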


Reynolds chipped in. “It means we can give customers better quality for longer. For example, the avocados for Prêt have to be the right pressure for ripeness. M3 takes what we can do here to a new level."

The order capture element within M3 has, said Reynolds and Calder, enabled the company to develop quick and robust order entry systems. This is important because it handles over 1,000 orders a day manually over the phone, with an average of 22 lines per order from chefs exhausted at the end of a shift. 

“They just want to place an order and go home," said Calder.

The M3 Process Flow Integrator (PFI) has put an automatic process in place to send a full email check list to the customer that they then sign off as correct. This has, estimates Calder, halved the number of order-taking errors. 

“A mistake over the phone could be a huge cost to us because we’d have to recover it the next day," he said.
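The confirmation step Calder describes amounts to generating a plain-text checklist from the keyed-in order lines for the customer to sign off, so mis-keyed lines are caught before the van leaves. The layout and field names below are illustrative, not the actual PFI output.

```python
# Sketch of an order sign-off checklist: turn the keyed-in lines into a
# plain-text summary the customer confirms before dispatch. The layout
# and field names are invented for illustration.
def order_checklist(order_no, customer, lines):
    rows = [f"Order {order_no} for {customer} - please confirm:"]
    for sku, qty, unit in lines:
        rows.append(f"  {qty:>4} {unit:<6} {sku}")
    rows.append(f"  {len(lines)} lines in total")
    return "\n".join(rows)

text = order_checklist("SO-88102", "Prezzo, Holborn", [
    ("Avocado, ready-to-eat", 3, "tray"),
    ("Vine tomatoes", 10, "kg"),
])
print(text)
```

With an average of 22 lines per phone order, a line count and per-line quantities in the email give the chef a quick way to spot a mis-heard item before it becomes a next-day recovery run.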

Calder described how the new ERP system has changed the company’s business culture. The original project team was large, but has now dispersed into the organisation. “The knowledge is spread. Previously, knowledge was within the four walls of IT. That is not ideal. It means too much reliance on us."

There is now a business improvement team, which is the focus for “looking at the business process as a whole. We’ve moved on from a small parts approach," said Calder.

Reynolds concludes that he has “really enjoyed being able to attract graduate-level people to the business now, because of the M3 platform." He had a spell in banking and understands why working for a greengrocer might lack glamour. But the technology applied to improve operations in a manufacturing process business like his can make a difference, he said.

“From the days of my grandad’s market barrow onwards”, he said, “4 October 2010 was a red-letter day that moved us forward. I can go back to the family and say that this investment has paid dividends”.



This was first published in November 2012

Interview: Bill Beckler, director of innovation, Lastminute.com

Since the recession hit, IT leaders have been caught in the dilemma of delivering innovation with shrinking budgets. In a recent exchange with bestselling author Thomas Friedman, Amazon founder Jeff Bezos said that to be able to do that, leaders must see themselves “not as designers but as gardeners” - that is, seeding, nurturing and cultivating good ideas coming from the team and ensuring execution.

Bill Beckler, director of innovation at online travel retailer Lastminute.com, aims to do exactly that. This month, for example, his team will be leaving its traditional office location to do product development in the lobby of the Cumberland Hotel in central London – the idea is to show prototypes to travelers and get immediate feedback on risky or raw product concepts.

“By the end of this, we will be able to build a brand new mobile app that we know travelers love because they would have already told us so,” Beckler told Computer Weekly.

“Innovating in travel is hard, because sometimes you need real travelers to tell you whether an idea stinks, whether they love it, or whether they just don't understand it,” he says.

“If we want to build a mobile app that customers love and that makes people say ‘Wow’ then we need to be testing our craziest ideas and iterating them with feedback from the marketplace. You simply cannot do that from within your office.”

Beckler added that, if successful, the experiment in the lobby of the Cumberland Hotel will be repeated in other locations.

Beckler stresses the point that innovation cannot be unleashed within an organisation unless people are working in conditions that encourage creativity and pertinent ideas, adding that his job as the innovation chief of a company that pioneered in its industry sector is to “mess with people.”

“I spice things up and create an environment that fosters better ideas and ways of working. This means breaking down barriers and creating cross-functional teams, which are absolutely necessary to get anything new to happen,” Beckler says.

He points out that innovation at Lastminute.com happens partly by preventing the execution of bad ideas. But how can that be done without stifling the creation/innovation process?

“Our goal is to fail fast rather than let bad ideas occupy our development and prototyping backlog,” he says.

“To do that we get as many ideas in front of customers as frequently as possible, without investing hugely in any of them without proof of their value.”

But many CIOs simply struggle to innovate due to their inability to think outside the IT boundaries. And Beckler – an IT expert whose jobs at Lastminute.com have ranged from development and marketing to operations and analytics – maintains that having a varied professional background has been key to his current role.

“My ability to bridge disciplines is my secret weapon for discovering big business opportunities,” he says.

Another key point is that producing innovative ideas doesn’t always have to be a costly process.

“In order to compete during a recession, you have to innovate just to survive. The trick for large companies is to learn from start-ups and innovate, without spending an arm and a leg,” Beckler says.

“My job is to shepherd some of our more radical business opportunities through a lean validation cycle, testing our riskiest assumptions and proving our new business models before they cost us too much. The lean message is a difficult one to teach to teams that always strive to deliver perfection, because ‘good enough’ is all we need to test an idea.”

Beckler says that recent innovations at Lastminute.com have been behind-the-scenes improvements that “make a lot of money” and that recently, the team has identified an opportunity around its Top Secret hotel product, which is still being developed, but is also expected to be very profitable.

However, Beckler says the main focus of the team right now is on mobile: “We just relaunched our mobile website for hotel shopping and we've seen a huge increase in sales in that channel. Our next step is a mobile app that travelers will love.”



AWS re:Invent: Users hold back from putting sensitive data in the cloud

Customers of AWS public cloud are reaping the benefits of cost savings, elasticity and scalability, but many are still wary of putting sensitive applications in the cloud, despite having a “cloud-first” and “cloud-ready” strategy.

At an AWS customer panel session at Amazon’s first user conference, re:Invent, Bharat Shyam, CIO of the State of Washington, the public sector organisation providing government information and services, said: “We adopted AWS and have seen benefits such as easy scalability and cost savings.”

“What we are not putting on the cloud are applications with sensitive data on it. This is because, as a public sector organisation, we are risk-averse,” Shyam said.

“It is not that the public cloud is less secure than our internal infrastructure but just that, should there be a breach, there may be allusions that it is because we did something different.”

Troy Otillio, cloud strategist at Intuit and another AWS public cloud customer, said that, despite finding efficiencies in IT with AWS cloud, Intuit still runs sensitive data on apps hosted in-house. Intuit develops financial and tax preparation software services.

“We have so far put about 12 to 15 applications on AWS but still refrain from putting any private or sensitive data yet on to the cloud,” Otillio said.

Many AWS customers – including publisher Elsevier and financial recruitment consultancy Robert Half International – said they had adopted a cloud-first or cloud-ready strategy.

“We have tried to make our internal datacentre unattractive so that our organisation adopts more cloud-based services," said Sean Perry, CIO of Robert Half International.

“Whenever we hear about hardware or virtualisation announcements such as VMware launching vBlock, we refrain from investing in them because we want to make cloud more attractive to our business,” Perry said.

Despite this cloud-first strategy and finding clear benefits of cloud computing, customer confidence in the public cloud to host sensitive services was low.

“We have a couple of physical tenants left in our datacentres for specific reasons but most of our apps are in the cloud,” Perry said.

Document management company Aconex is an AWS customer and has refrained from putting its customer data on AWS. 

“Our IT team is confident of putting sensitive data in the cloud but our customers would not allow us to," said Nock Hobden, an engineer at Aconex.

Anil Shrestha, IT manager of Verisk Health, said: “We are a healthcare service provider but the business agreement clause we have with our customer forces us to keep these apps in-house.”

Government organisations are not good at taking risks, but something will tip the scales soon and users will become more confident about using the cloud for all types of applications, Shyam said.

But not all organisations are equally wary. Netflix has moved all its applications onto the AWS cloud. 

“AWS shared white papers, which have little tricks and tips on how to manage data on the cloud, which has helped us put some of our apps with critical data on the cloud,” Perry said.

Experts explained that many users are wary of the cloud because of the fear that cloud is beyond their control.

“No matter how secure the cloud is, some data are hosted in-house for various reasons such as government regulations and compliance,” said Darren Person, CTO of Elsevier.

While AWS reiterated the cloud security certifications it has achieved, it is the certifications and compliance of the apps that matter, Person said.

“AWS can offer all the certification they want, but if the app you put on the cloud is not certified, then there could be data loss,” Person said.

He urged users to follow the best practices of cloud computing such as encrypting data in transit as well as data at rest on the cloud.
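The practice Person recommends can be turned into a simple promotion gate: before an app moves to the cloud, check that its deployment settings enforce encryption both in transit and at rest. The config shape below is invented for illustration; a real check would inspect the platform's actual settings.

```python
# Toy pre-deployment policy gate: refuse to promote an app whose config
# does not enforce encryption in transit (TLS) and at rest. The config
# keys are invented for illustration.
REQUIRED = {"tls_enabled": True, "encrypt_at_rest": True}

def violations(config):
    """List the required settings the config fails to meet."""
    return [key for key, wanted in REQUIRED.items() if config.get(key) != wanted]

good = {"tls_enabled": True, "encrypt_at_rest": True, "region": "us-east-1"}
bad = {"tls_enabled": True, "encrypt_at_rest": False}

print(violations(good))  # → []
print(violations(bad))   # → ['encrypt_at_rest']
```

This is the app-level check Person argues for: the provider's own certifications say nothing about whether an individual deployment actually switched encryption on.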



Sunday, December 2, 2012

CIO interview: Tony Prestedge, COO, Nationwide

Tony Prestedge is COO at Nationwide. He joined the building society halfway through a £1bn IT transformation, which he then took charge of. He talks to Computer Weekly about the progress being made, the benefits already being seen and what the future holds for the firm’s IT.

In 2008, Nationwide set out on a journey to replace ageing IT systems.

Years of under-investment meant the building society's systems had not been kept up to date, so it committed to spend £1bn on replacing core back-end systems that had been built in-house decades ago with off-the-shelf banking software from SAP.

At the same time it chose to move its customer-facing delivery channels onto Microsoft software. It also led to Nationwide outsourcing IT for the first time (see box below: Nationwide goes outside).

The plan was to make it easier for the company to introduce new systems and at the same time cut costs. The project, which is due to complete next year, has passed six major milestones in the past 12 months.

In 2012 alone, Nationwide has virtualised its desktops, introduced a new mortgage platform across 740 branches, gone live on its SAP core banking system, completed and made available its mobile banking service, deployed systems related to its Legal & General insurance business, and completed the development of its data warehouses.

“This year has been the biggest single year of the project,” says Prestedge.

In 2008, Nationwide did not outsource any IT. Today it works with Accenture, TCS, IBM, Computacenter and BT.

The company has retained control of its IT design, but has used third parties to enable it to transform while the business runs as usual.

As it moved to suppliers, the building society has reduced its use of contractors, cutting some 200 that were working almost permanently. Nationwide also has 700 people in India. It predicts 60% of IT development and support will eventually be offshore.

The building society has about 500 product and infrastructure workers, 80% of whom are permanent. It has 1,500 transformation and application development and support staff, of whom 45% are permanent.

Prior to 2008, over 90% of Nationwide’s IT staff were in-house.

The latest milestone to be reached in the project was the launch of its mobile banking service, which is built on the new internet bank introduced in October 2011.

Over the next 12 months, Nationwide will complete the migration from its previous Unisys core system to SAP, expand from retail into business banking, and focus on meeting government regulations to make it easier for consumers to switch bank accounts. It will also focus on “taking SAP and other platforms further across the estate and regular upgrades”, says Prestedge.

Despite the turmoil that struck the banking sector in 2008, Nationwide stuck with the project, investing between £420m and £500m each year over the past four years. Before that, its annual IT spend was around £150m.

Now that the underlying technology at the building society is coming together, the project will gradually change in nature. Rather than large projects with lots of heavy lifting, the work will be more about spreading the use of new technologies within the organisation. This will give Nationwide an opportunity to further improve efficiency and customer satisfaction through scheduled software releases.

The company has cut its operational costs by 10% as a result of the technology already implemented. This equates to about £40m.

But customer satisfaction is as important as anything for financial institutions. Prestedge says there has been an improvement in customer experience, according to independent research.

“The customer experience is vital,” he says, citing the three influences on customer experiences: products and pricing; channel accessibility; and payments technology. The IT underpinning the building society supports all of these.

Nationwide's application development processes, key to meeting customer demands, will be more efficient as a result of off-the-shelf software and access to outsourced development resources. It will be able to introduce new products and services more quickly. With customers using different channels, such as mobile, this is essential.

It is harnessing new models, demonstrated by its move to off-the-shelf software and outsourcing, and its development methods are changing to become more agile.

Prestedge says the ideal software development methods would be more akin to how Formula One cars are designed, where computer-generated images are used and wind is simulated to test aerodynamics, for instance.

When a new app is required, he says, the business should draw a picture of what it wants and let the developers work from that, rather than prescribing from the beginning how it should be done.

But Prestedge says Nationwide must balance accessibility with security, and must protect the building society's identity by ensuring the technology does not turn it into a generic finance bank.

An unexpected benefit of the project is how internal staff have taken on board what the company is doing. “We are really beginning to see real confidence from the 22,000 staff in the organisation of Nationwide's position in the market,” he says.

The building society has given itself a technology platform that will support the business and keep it at least in line with competitors for a decade. Its costs are coming down, customers are happier, and staff are confident.

But, as with all finance firms, challenges remain. Nationwide will need to fully harness its new-found agility, rather than repeat its past mistake of under-investing in IT, if it is to prosper in a tough market.


Private cloud does not bring full benefits of cloud computing, says AWS

Amazon Web Services, the biggest public cloud service provider, has hit out at private cloud providers at its first user conference, AWS re:Invent, claiming that private cloud users are not achieving the full benefits of cloud computing.

Andy Jassy, senior vice-president at Amazon Web Services (AWS), told the conference that the full benefits of cloud computing are achievable only on a truly cloud-based service.

“Beware of cloud washers,” Jassy warned the 6,000-plus delegates at the conference. “Be careful of what is being offered under the name of cloud.”

According to AWS, the high flexibility, agility and cost savings of cloud cannot be fully achieved in a private cloud, because it still involves a huge amount of hardware investment.

“Those who are using private cloud are wasting millions on implementing hardware and then installing cloud-based software on top of it,” said Iain Gavin, director of the UK and EMEA business at AWS. “It is not really cloud.”

According to Gavin, building your own datacentre with private cloud capabilities involves guesswork: enterprises have to invest in hardware based on anticipated capacity.

“It is difficult to understand how much capacity an enterprise will need,” Gavin said.

“For example, Playfish, the London-based company that developed games for Facebook, anticipated that it would have about 250,000 users in its lifetime, but had about 8 million users in the first week,” Gavin said.

Playfish used AWS cloud services and was able to scale up its services to meet this unprecedented demand. 

“If they were on private cloud, scaling up to this level would have been almost impossible without incurring huge costs,” he said.

One of the hallmarks of cloud computing services is cost savings.

Jassy pointed out that traditional hardware and software providers – including IBM, HP and Oracle – focus on a high-margin, low-volume business model. According to AWS, traditional providers operate on a 60-70% margin.

“High-margin businesses may be a valid business model, but it is not one for us,” Jassy said.

With high economies of scale, Amazon can extend cost benefits to its public cloud customers, helping them benefit from the true advantages of cloud such as agility and cost savings, Jassy said.

“Traditional businesses don’t like our strategy because it disrupts their business,” Jassy said.

Nasdaq, one of AWS's customers, said it chose the AWS public cloud for its scalability and low costs.

“If we were to use private cloud, we would not have been able to operate our IT infrastructure so cost-effectively,” Ted Myerson, global head of access services at Nasdaq transaction services, told Computer Weekly.

But not all experts agreed with AWS's dismissal of private cloud.

“Jassy is exaggerating public vs internal/private cloud. Many clients are seeing the benefits of internal/private cloud too,” said Kyle Hilgendorf, principal research analyst at Gartner.

But he added: “Jassy is right that the economics of cloud are not appealing to 'old guard' tech vendors.”
