Search This Blog

Saturday, November 30, 2013

Firm launches banking app for wearable technology

The British firm Intelligent Environments has launched a new banking app for use with Pebble wearable smart devices.

This adds to the growing number of capabilities for wearable technology, which has been breaking into the business sector during the new bring your own device (BYOD) era.


The wearable computing trend has been gradually growing. With devices such as Pebble Watches, Google Glass and Nike FuelBands, people are able to get the information they need without having to look at their smartphone or other devices.

Despite Samsung, Pebble and Sony recently entering the smartwatch market, Gartner says demand will be weak this Christmas. Gartner attributed the slow uptake to a lack of available apps and the difficulty of expanding from the health and fitness market into the wider consumer space.

"The convenience aspect of using a watch for interaction while leaving the larger-screen phone or tablet in the bag or pocket is something that users can relate to and probably recognise its value," said Annette Zimmermann, principal research analyst at Gartner. "However, there are still several significant barriers to mainstream adoption, including low interest and awareness among consumers, poor design and price."

An increase in apps such as the banking app by Intelligent Environments may help boost the sale of smart watches in the future as long as banks are willing to allow consumers to check their accounts using this method.

The app, which was launched to commemorate the 30th anniversary of Homelink, the first online banking system, is designed for use with the Pebble smartwatch as well as other Pebble devices, and will help customers easily track their finances.

Using the app and watch combo, users can check their current balance and recent transactions, and can set up warnings such as a vibrating alert when they are near their overdraft limit. The app allows users to check these details more easily than physically travelling to a bank or taking the time to log into an online account on a PC or smartphone.
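As a rough sketch of how such an alert threshold might work (the function name, buffer value and sign conventions below are illustrative assumptions, not Intelligent Environments' actual logic):

```python
def near_overdraft(balance, overdraft_limit, buffer=50.0):
    """Return True when `balance` is within `buffer` of the overdraft limit.

    `overdraft_limit` is the agreed limit as a positive number: a limit of
    500 means the balance may go as low as -500 before the account is
    over its limit.
    """
    return balance <= -overdraft_limit + buffer

# A watch app polling the account could vibrate when this turns True.
print(near_overdraft(-460, 500))  # within 50 of the -500 floor
```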

Intelligent Environments is now talking with a range of banks in the hope they will soon allow customers to use this smart technology.



Apple acquires Kinect developer PrimeSense

After months of rumour, Apple has confirmed the acquisition of Israeli motion-sensing firm PrimeSense, which developed the gesture control feature for Microsoft’s Kinect add-on for the Xbox 360.

The second-generation Kinect hardware was developed in-house by Microsoft, making its official debut as part of the Xbox One launch. It will be released for Windows sometime in 2014, according to Neowin.


No financial details or reasons for the acquisition have been disclosed, but some reports said the deal was worth $360m and that Apple plans to use the technology for a new generation of products.

In particular, the deal is expected to support a 2011 patent filing by Apple for new ways to control devices without physical contact, according to the BBC.

The patent filing, for "Real Time Video Process Control Using Gestures", talks about allowing users to "throw" content from one device to another using hand movements.

The acquisition comes after PrimeSense unveiled advances it has made in incorporating its technology into mobile devices such as a Google Nexus 10 tablet.

According to the company’s website, its 3D sensing technology gives digital devices "the ability to observe a scene in three dimensions" and translate that into a synchronised image stream of depth and colour.

This image stream can be used for identification of people, their body properties, movements and gestures; the classification of objects such as furniture; and the location of walls and floor.

PrimeSense has also developed a 3D scanner, which the firm says allows "anyone to scan items in their own environment and print them out on a 3D printer".

In addition to computing devices, the company says its technology can be used in TVs, interactive displays and robotics as well as applications in the retail, healthcare and industrial sectors.



UK CERT moves to next phase with director Chris Gibson

Chris Gibson has been confirmed as the director of the UK’s new national computer emergency response team (CERT-UK), which is set to become operational in early 2014.

Francis Maude, the Minister for Cabinet Office, said Gibson brings a wealth of experience in cyber incident response in the private sector, both in the UK and internationally. 


“His first-hand knowledge and understanding of cyber security will be invaluable as he leads the national CERT,” he said.

Gibson’s former roles have included director of e-Crime at Citigroup and member of the leadership team of the international Forum of Incident Response and Security Teams (First), with the last two years as global Chair.

He is also a member of the British Bankers’ Association (BBA) Cyber Advisory Panel, and for 10 years was one of Citigroup’s representatives to the Centre for the Protection of National Infrastructure’s Financial Services Information Exchange.

In December 2012, the government announced plans for a CERT-UK based on lessons learned from the 2012 London Olympics that were fed into the UK Cyber Security National Incident Management policy.

That policy, said Maude, sets out the importance of strengthening the UK’s response to cyber incidents. 

“By establishing CERT-UK we will build on and complement our existing CERT structures [for critical national infrastructure], which will help improve national co-ordination of cyber incidents,” he said.

CERT-UK will also act as a focus point for international sharing of technical information on cyber security and be a single point of contact for other similar teams around the world.

The new CERT will provide a core incident management response, lead international engagement and provide cyber situational awareness and information sharing for the benefit of the UK as a whole. 

Gibson said that with the implementation underway and the leadership team now being appointed, CERT-UK is entering its next phase.

“I am looking forward to the task of bringing together government, industry, law enforcement and academia to establish CERT-UK as a team of professionals forming a world-class response to cyber threats to the UK,” he said.



Friday, November 29, 2013

Global public sector outsourcing far outstrips private sector

Global public sector organisations spent €7.4bn in the third quarter on IT outsourcing and business process outsourcing (BPO), compared with €4.6bn in the private sector, with the UK one of the top three spenders.

The study from ISG, which is its first to focus on the public sector, revealed that the UK, the US and Australia spent the most on public sector outsourcing. Globally, the public sector now accounts for 56% of the outsourcing market.


Looking at deals with annual contract values of at least €4m, the UK accounted for 122 deals worth €5.1bn in 2013. The rest of Europe spent less than €1.5bn in total.

“Government organisations are not that different from large corporate firms: both want to save money and operate efficiently,” said John Keppel, partner and president of ISG North Europe. “Despite pressure to control spending, governments can’t simply put infrastructure investments aside. Instead, they are relying increasingly on outsourcing to balance the need for efficient, localised services with the need to rein in costs.”

Keppel told Computer Weekly last month why ISG planned a report focused specifically on the public sector. "For the first time we are going to do a full year report on global public sector outsourcing. It has become more important to the market and to us."

The UK public sector is investing heavily in BPO to run pensions and social security, taxation and other e-government initiatives, said ISG.

The public sector requires a different approach from suppliers because public sector organisations buy differently from businesses. For example, ISG said service delivery reach matters more than shareholder value; geographic scope is local, not global; service integration is more important because consortium bids are more common; procurement constraints, such as contract size and duration, differ from the private sector; and there is a reluctance to offshore work.

Rod Matthews, head of consultancy at local government IT user group Socitm, said major changes are going on in local government IT outsourcing, which is causing some “sleepless nights”.

Matthews said BPO is going through a major transformation because of technology, and local authorities do not want to commit to long deals that tie them into a particular delivery model.

He said BPO suppliers currently buy the software and provide it to the local authority along with services, but in future, with the rise of software as a service, it might be less expensive for local authorities to buy the software themselves and pay only for services.

“It is hard to predict what is happening in local government IT, and IT directors are feeling uncomfortable making decisions as we are on the verge of something different,” added Matthews.



Tim Berners-Lee calls for protection of basic rights online

Web inventor Tim Berners-Lee (pictured) has reiterated warnings that the democratic nature of the internet is under threat from increased surveillance and censorship.

A firm advocate of net neutrality, Berners-Lee has been a strong critic of internet surveillance by UK and US intelligence agencies, describing the decision to crack encryption methods as “appalling and foolish”.


Berners-Lee has also been among the strongest opponents to proposed legislation in the UK and US aimed at censoring content and giving authorities the right to monitor electronic communications.

His latest warning came at the launch of the second edition of the World Wide Web Foundation’s annual web index report that tracks global censorship.

The report measures the World Wide Web’s contribution to development and human rights, but the latest edition says monitoring of government internet interception is inadequate in 94% of countries.

The report highlights the fact that 30% of countries block or filter political content and concludes that the current legal framework on government surveillance needs urgent review.

Berners-Lee said one of the most encouraging findings of the latest report is how the web and social media are increasingly spurring people to organise and take action to expose wrongdoing.

"But some governments are threatened by this, and a growing tide of surveillance and censorship now threatens the future of democracy,” he said.

Berners-Lee called for bold steps to be taken to protect fundamental rights to privacy and freedom of opinion and association online, according to the BBC.

According to the latest index report, which ranks countries in terms of social and political impact of the web, Sweden tops the table for the second year running, followed by the UK, US and New Zealand.

Despite the high ranking of the UK and US, both countries come in for criticism for surveillance practices.

However, the report shows that targeted censorship of web content by governments is widespread across the globe.

Moderate to extensive blocking or filtering of politically sensitive content was reported in over 30% of the countries indexed in the past year.

Legal limits on government snooping online urgently need review, the report said, with most countries failing to meet best practice standards for checks and balances on government interception of electronic communications.

However, the report said the web and social media are leading to real-world change.  The report said that in 80% of the countries studied, the web and social media had played a role in public mobilisation in the past year, and in half of these cases, had been a major catalyst.

However, the report said the rights and priorities of women are poorly served by the web in the majority of countries researched. 

Locally relevant information on topics such as sexual and reproductive health, domestic violence, and inheritance remain largely absent from the web in most countries, and only 56% of countries were assessed as allocating ‘significant’ resources to ICT training for women and men equally.



Twitter increases protection from government snooping

Twitter has announced it is using a spin-off of the Diffie-Hellman method, first developed by GCHQ in the 1970s, to protect users' data from snooping by government intelligence agencies.

“Perfect forward secrecy” (PFS) is now live across all platforms, Twitter said, which makes it “effectively impossible” to collect data on users without the company’s permission, according to experts.


The move is thought to be part of a bid to make it more difficult to collect data on users without going through legal channels, according to the Telegraph.

Introduction of PFS ensures protection of encrypted data even if another party obtains decryption keys, as US and UK intelligence agencies have done in the past according to whistleblower Edward Snowden.

An internal team of security engineers has spent several months implementing PFS, which adds an extra layer of security to the widely used HTTPS encryption.

Google, Facebook, Dropbox and Tumblr have all implemented PFS, and LinkedIn is understood to be introducing it in 2014, according to the Guardian.

Technology companies and online service providers are attempting to restore user trust in the wake of the Snowden revelations of the US Prism internet surveillance programme.

The introduction of PFS means greater protection of direct private messages, protected tweets and data on what users say, who they comment on and who else they read.

PFS creates a new, disposable key for each exchange of information, so the key for every individual session would have to be decrypted to access the data.

In Elliptic Curve Diffie-Hellman (ECDHE), which supports PFS, the server’s private key is used only to sign the key exchange, preventing man-in-the-middle attacks, according to Twitter.
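The per-session key idea can be illustrated with a toy finite-field (not elliptic-curve) Diffie-Hellman exchange. The parameters and function names below are a sketch for illustration only, not Twitter's implementation:

```python
import secrets

P = 2**255 - 19   # a large public prime (illustrative parameters only)
G = 2             # public generator

def ephemeral_keypair():
    """Generate a fresh key pair for a single session, to be discarded after."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def session_secret(my_priv, their_pub):
    """Both sides derive the same secret from the exchanged public values."""
    return pow(their_pub, my_priv, P)

# Client and server each generate throwaway keys for this one session.
c_priv, c_pub = ephemeral_keypair()
s_priv, s_pub = ephemeral_keypair()
assert session_secret(c_priv, s_pub) == session_secret(s_priv, c_pub)
```

Once both sides discard their private values, the session secret cannot be reconstructed; this is the forward-secrecy property, so a later compromise of the server's long-term signing key reveals nothing about past sessions.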

Ironically, the underlying Diffie-Hellman key exchange method was first developed at GCHQ and remained classified, while US cryptographers Whitfield Diffie and Martin Hellman independently made and patented the discovery.

In a blog post announcing the implementation, Twitter said PFS is what should be the “new normal” for web service owners to protect users from all predators on the internet.

“If you are a webmaster, we encourage you to implement HTTPS for your site and make it the default. If you already offer HTTPS, ensure your implementation is hardened with HTTP Strict Transport Security, secure cookies, certificate pinning, and Forward Secrecy,” the post reads.
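As a minimal sketch of the hardening headers the post refers to (the values below are common illustrative choices, not Twitter's configuration):

```python
# Response headers a hardened HTTPS deployment typically adds.
def hardened_headers():
    return {
        # HSTS: tell browsers to refuse plain HTTP for the next year.
        "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
        # Mark the session cookie HTTPS-only and invisible to page scripts.
        "Set-Cookie": "session=opaque-token; Secure; HttpOnly",
    }

print(hardened_headers()["Strict-Transport-Security"])
```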

Twitter also calls on website users to demand that the sites they use implement HTTPS to help protect privacy and to use an up-to-date web browser with the latest security improvements.

“HTTPS is surprisingly important for any web service that lets you login up front and then stay logged in indefinitely,” said Paul Ducklin, security technologist at security firm Sophos.

“That's because your logged-in status is usually dealt with by a session cookie that is used by the server to recognise a user and transmitted in the HTTP traffic,” he wrote in a blog post.

According to Ducklin, without HTTPS to encrypt the cookie between the browser and the server, an attacker could sniff the traffic, extract the cookie and use it to masquerade as a legitimate user.

Plain HTTPS only requires the server to send a user a public key to which it has a matching private key, allowing the server to use the same public-private keypair over and over again.

HTTPS with forward secrecy, however, requires the server to send you a public key that is unique to a session, so the corresponding private key can be destroyed after use, said Ducklin.

“That's how the forward secrecy is achieved: once the decryption keys from your session are destroyed, any copies of the encrypted data are effectively ‘nailed down’ into an eternally-encrypted state, like a padlock to which you've lost the key,” he said.



Thursday, November 28, 2013

Optimising performance and security of web-based software

On-demand applications are often talked about in terms of how software suppliers should be adapting the way their software is provisioned to customers.

These days, however, the majority of on-demand applications are provided by user organisations to external users – consumers, users from customer or partner organisations, and their own employees working remotely.


More direct online interaction with consumers, partners and other businesses should speed up processes and sales cycles and extend geographic reach – organisations that do not interact online in this way will be less competitive.

But there are two big caveats. First, benefits will only be gained if the applications perform well and have a high percentage of uptime – approaching 100% in many cases. Second, any application exposed to the outside world is a security risk: it can be attacked as a way into an organisation’s IT infrastructure through software vulnerabilities, or targeted to stop the application itself from running effectively – application-level denial of service (DoS) – limiting the organisation’s ability to continue with business and often damaging its reputation as a result.

So, how does a business ensure the performance and security of its online applications?

The key capabilities involved include:

- Network traffic compression – to speed up transmission.
- Network connection multiplexing – making effective use of multiple network connections.
- Network traffic shaping – a way of reducing latency by prioritising the transmission of workload packets and ensuring quality of service (QoS).
- Application-layer security – the inclusion of web application firewall (WAF) capabilities to protect on-demand applications from outside attack, for example application-level denial of service (DoS).
- Secure sockets layer (SSL) management – acting as the landing point for encrypted traffic and managing the decryption and rules for ongoing transmission.
- Content switching – routing requests to different web services depending on a range of criteria, for example the language settings of a web browser or the type of device the request is coming from.
- Server health monitoring – ensuring servers are functioning as expected and serving up data and results that are fit for transmission.

Two things need to be achieved here. First, there needs to be a way of measuring performance. Second, there needs to be an appreciation of, and investment in, the technology that ensures and improves performance.

Testing the performance of applications before they go live can be problematic. Development and test environments are often isolated from the real world, and while user workloads can be simulated to test performance on centralised infrastructure, the real-world network connections users rely on, which are increasingly mobile ones, are harder to test. The availability of public cloud platforms helps, as runtime environments can be simulated, even if the ultimate deployment platform is an internal one. This saves an organisation having to over-invest in its own test infrastructure.

So, upfront testing is all well and good, but ultimately, the user experience needs to be monitored in real time after deployment. This is not just because it is not possible to test all scenarios before deployment, but because the load on an application can change unexpectedly due to rising user demand or other issues, especially over shared networks. User experience monitoring was the subject and title of a 2010 Quocirca report, much of which is still relevant today, but the biggest change since then has been the relentless rise in the number of mobile users.

Examples of tools for the end-to-end monitoring of the user experience, which covers both the application itself and the network impact on it, include CA’s Application Performance Management, Fluke Networks’ Visual Performance Manager, Compuware’s APM and ExtraHop Networks’ specific support for Amazon Web Services (AWS).

It is all well and good being able to monitor and measure performance, but how do you respond when it is not what it should be?

There are two issues here: First, the ability to increase the number of application instances and supporting infrastructure to support the overall workload; and second, the ability to balance that workload between these instances.

Increasing the resources available is far easier than it used to be with the virtualisation of infrastructure in-house and the availability of external infrastructure as a service (IaaS) resources. For many, deployment is now wholly on shared IaaS platforms where increased consumption of resources by a given application is simply extended across the cloud service provider’s infrastructure. This can be achieved because with many customers sharing the same resources, each will have different demands at different times.

Global providers include AWS, Rackspace, Savvis, Dimension Data and Microsoft. There are many local IT service providers (ITSPs) with cloud platforms in the UK, including Attenda, Nui Solutions, Claranet and Pulsant. Some ITSPs partner with one or more global providers to make sure they too have access to a wide range of resources for their customers.

Even those organisations that choose to keep their main deployment on-premises can benefit from the use of “cloud-bursting” – the movement of application workloads to the cloud to support surges in demand – to supplement their in-house resources. In Quocirca’s recent research report, In demand: the culture of online services provision, those organisations providing on demand applications to external users were considerably more likely to recognise the benefits of cloud-bursting than those that did not.

Being able to measure performance and having access to virtually unlimited resources to respond to it is one thing, but how do you balance the workload across them? The key technologies for achieving this are application delivery controllers (ADCs).

ADCs are basically next-generation load balancers and are proving to be fundamental building blocks for advanced application and network platforms. They enable the flexible scaling of resources as demand rises and/or falls and offload work from the servers themselves.
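At their core, ADCs still perform load balancing. A minimal round-robin sketch of that function (real ADCs weight decisions by server health, session affinity and latency, none of which is modelled here):

```python
import itertools

class RoundRobinBalancer:
    """Toy load balancer: hand out backend servers in strict rotation."""

    def __init__(self, backends):
        self._cycle = itertools.cycle(backends)

    def pick(self):
        """Return the backend that should receive the next request."""
        return next(self._cycle)

# Spread incoming requests across three application instances.
lb = RoundRobinBalancer(["app-1", "app-2", "app-3"])
```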

The best-known ADC supplier was Cisco, but Cisco recently announced it would discontinue further development of its Application Control Engine (ACE) and recommends another leading supplier’s product instead – Citrix’s NetScaler. Other suppliers include F5 – the largest dedicated ADC specialist – Riverbed, Barracuda, A10, Array Networks and Kemp.

So, you can measure performance, you have the resources to meet demand and the means to balance the workload across them, as well as offload some of the work with ADCs – but what about security?

The first thing to say about the security of online applications is you do not have to do it all yourself. Use of public infrastructure puts the onus on the service provider to ensure security up to a certain level. There are three main approaches to testing and ensuring application security: code and application scanning; manual penetration testing; and a web application firewall.

In code and application scanning, thorough scanning aims to eliminate software flaws. There are two scanning approaches: the static scanning of code or binaries before deployment; and the dynamic scanning of binaries during testing or after deployment. On-premises scanning tools have been relied on in the past – IBM and HP bought two of the main suppliers – but the use of on-demand scanning services, for example from Veracode, has become increasingly popular as the providers of such services have visibility into the tens of thousands of applications scanned on behalf of thousands of customers. Such services are often charged for on a per-application basis, so unlimited scans can be carried out, even on a daily basis. The relatively low cost of on-demand scanning services makes them affordable and scalable for all applications including non-mission critical ones.
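In its simplest form, static scanning is pattern matching over source code. The rule set below is an illustrative toy, nowhere near a commercial scanner's analysis:

```python
# Toy static scan: flag calls that commonly indicate injection risk.
RISKY_CALLS = ("eval(", "exec(", "os.system(", "pickle.loads(")

def scan_source(source):
    """Return the risky call patterns found in a source-code string."""
    return sorted(tok for tok in RISKY_CALLS if tok in source)

print(scan_source("result = eval(user_input)"))
```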

The user experience of online applications needs to be monitored in real time after deployment

The second technique is manual penetration testing (pen-testing). Here, specialist third parties are engaged to test the security of applications and the effectiveness of defences. Because actual people are involved in the process, pen-testing is relatively expensive and only carried out periodically, so new threats may emerge between tests. Most organisations will find pen-testing unaffordable for all deployed software, so it is generally reserved for the most sensitive and vulnerable applications.

The third layer in web application security is to deploy a web application firewall (WAF). WAFs are placed in front of applications to protect them from application-focused threats. They are more complex to deploy than traditional network firewalls and, while affording good protection, do nothing to fix the underlying flaws in the software. WAFs also need to scale with traffic volumes, so more traffic means more cost. WAFs are a feature of many ADCs, and are less likely to be deployed as separate products than they were in the past. They also protect against application-level DoS, where scanning and pen-testing cannot.
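Conceptually, a WAF inspects each request against attack signatures before it reaches the application. A toy sketch of that idea (real rule sets, such as the OWASP Core Rule Set, are far richer):

```python
import re

# A few illustrative signatures for SQL injection and cross-site scripting.
RULES = [re.compile(p, re.IGNORECASE) for p in (
    r"union\s+select",
    r"<script\b",
    r"'\s*or\s*'1'\s*=\s*'1",
)]

def request_allowed(query_string):
    """Reject the request if any rule matches its query string."""
    return not any(rule.search(query_string) for rule in RULES)

print(request_allowed("id=1' or '1'='1"))  # classic injection probe
```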

Complete software security is never going to be guaranteed, and many organisations use multiple approaches to maximise protection. However, while one of the reasons for having demonstrable software security is to satisfy auditors, compliance bodies do not themselves mandate multiple approaches. For example, the Payment Card Industry Security Standards Council (PCI-SSC) deems code scanning to be an acceptable alternative to a WAF.



This was first published in November 2013

Security investments reflect digital take-up in India

Indian organisations, including government bodies, are taking information security seriously and implementing a wide range of security systems as they increasingly move online and offsite.

The Indian government is leading by example, embarking on a plan to realise its ambition of creating a workforce of 500,000 professionals skilled in cyber security in the next five years through capacity building, skill development and training. The government plans to build and operate a 24x7 National Critical Information Infrastructure Protection Centre (NCIIPC).

Meanwhile, businesses are increasingly relying on the internet for customer interactions and corporate operations.

According to the findings of TechTarget research, which polled 340 India-based IT professionals, businesses are addressing a breadth of security risks through technology.

The most common security plans for the next year are initiatives around network-based security, with 50.3% of respondents planning this. Data loss prevention (47.4%) and application-based security (41.2%) were the next most planned.

The proportion planning other security initiatives includes: threat detection/management (38.8%); identity and access management (34.1%); security data management/analysis (33.8%); mobile endpoint security (31.5%); virtualisation security (34.1%); vulnerability management (35.3%); and encryption (34.7%).

Less than 1% (0.6%) did not plan on running initiatives in any of these areas and 8% did not know.

Nasscom chairman and Mindtree CEO Krishnakumar Natarajan said cyber security in India is far more advanced than physical security, but it must continuously evolve to secure citizens and businesses in the light of a high number of attempted attacks.


“India has to create the full ecosystem relating to cybercrime. The framework, the tools, and the assets you will use to protect data and ensure there are no risks," he said.

He said it is not just about technology. “We need to get the legal and regulatory response to attacks right,” he said.

Businesses in India are on a steep adoption curve for online services, but investment in the supporting infrastructure often lags behind.

Vikram Nair, head of Europe at Tech Mahindra, said Indian businesses rely on the internet and it is a challenge to keep security up to date. 

“When I left India in 1996 the infrastructure was fixed lines that did not extend across the country, but now mobile has changed things," said Nair. 

“You cannot survive as a business without the internet in India and the take-up of mobile is buoyant. But there is a lot of under-investment in the IT equipment that connects businesses to the internet, including security.”


Wednesday, November 27, 2013

More than one billion BYOD users predicted by 2018

Employee-owned smartphones and tablets used as part of bring your own device (BYOD) policies will increase to over one billion devices globally by 2018, according to a study by Juniper Research.

The report stated that this predicted figure, which accounts for 35% of all consumer mobile devices, is the result of increased mobile adoption and the realisation that mobile devices can help to improve work-life balance.


This trend has led to growth in the mobile device security market, with Western Europe leading the sector as the largest source of revenue for sales of mobile security software.

The increase in mobile security spending is a result of more malware targeting smartphones and mobile devices over the past two years. Almost two-thirds of internet users now access the web using a mobile device, and almost 70% of online threats can damage devices or compromise mobile users’ data.

There has been a huge increase in hackers targeting mobile devices as opposed to PCs or laptops, with supplier Trend Micro recently predicting there will be more than one million mobile device-directed malware exploits by the end of 2013. Despite the increase in mobile threats, Juniper’s research found that 80% of consumer and enterprise smartphones will remain unprotected throughout 2013.

Due to an increase in awareness of mobile security, Juniper predicted that the number of mobile devices with appropriate security will increase to 1.3 billion by 2018, up from 325 million in 2013. Juniper also predicted that by 2018 more than 50% of mobile devices in the US will be equipped with mobile security apps.

Nitin Bhas, senior analyst for Juniper Research and one of the authors of the report, said: “The BYOD trend is something that CIOs and IT managers cannot ignore given the increasing number of employees bringing their own devices to the business, whether such activity is officially sanctioned or not.”



150 subpostmasters file claims over 'faulty' Horizon accounting system

About 150 subpostmasters have filed claims to the investigation into the allegedly faulty Post Office accounting system used by thousands of subpostmasters.

The Post Office-funded investigation into the Horizon accounting system, which is being conducted by forensic expert Second Sight, had an initial deadline for claims last week (18 November).


Subpostmasters have suffered heavy fines and even jail terms as a result of alleged false accounting, but many have continuously blamed the Horizon system. The Post Office could face significant compensation claims.

Alan Bates of the Justice for Sub-postmasters Alliance (JFSA) pressure group said about 150 subpostmasters have claimed they were wrongly blamed for accounting shortfalls. “We cannot put a final figure on it because we are still tying up loose ends,” he said.

In July 2013 the investigation reported concerns in relation to Horizon. 

The concerns included:

- Unreliable hardware
- The absence of “proper” system training and support
- The complexity of linking with a large number of other systems
- A business model that puts responsibility for dealing with small system problems with sub-postmasters
- The way the Post Office has in the past investigated concerns about transactions

The Horizon accounting system, used by thousands of sub-postmasters, has been blamed by many for sub-postmasters being wrongly charged and even jailed for accounting shortfalls. Others have had to make up cash discrepancies following prosecutions. The Post Office defended the Horizon system unrelentingly until recently, when it came under pressure from the JFSA, MPs and the press.

The next phase will see some of these claims get professional support to put together detailed case reviews. Bates said it is believed that most of the claims put forward will move to this phase. Second Sight will examine the cases more closely once the reviews are complete.

Last month, former Lord Justice of Appeal, Anthony Hooper, was appointed to oversee close scrutiny of selected cases.

One claimant told Computer Weekly she has received good support from the JFSA and Second Sight.

A former subpostmaster told Computer Weekly that if the investigation proves there were faults with the Horizon system they will “shout it from the rooftops.”

Computer Weekly timeline of events:

May 2009 - Bankruptcy, prosecution and disrupted livelihoods - Postmasters tell their story 

September 2009 - Post-masters form action group after accounts shortfall 

November 2009 - Post Office theft case deferred over IT questions

February 2011 - Post Office faces legal action over alleged accounting system failures

October 2011 - 85 sub-postmasters seek legal support in claims against Post Office computer system

June 2012 - Post Office launches external review of system at centre of legal disputes

January 2013 - Post Office admits that Horizon system needs more investigation

January 2013 - Post Office announces amnesty for Horizon evidence 

January 2013 - Post Office wants to get to bottom of IT system allegations 

June 2013 - Investigation into Post Office accounting system to drill down on strongest cases 

July 2013 - Post Office Horizon system investigation reveals concerns 

October 2013 - End in sight for sub-postmaster claims against Post Office's Horizon accounting system

October 2013 - Former Lord Justice of Appeal Hooper joins Post Office Horizon investigation



Scientists banish NAS sprawl with Red Hat scale-out NAS software

The University of Reading’s meteorology department has standardised on open source storage software from Red Hat in a £100,000 project.

The move to scale-out network-attached storage (NAS) has brought predictable performance and saved large amounts of maintenance time, compared with the chaotic results of NAS sprawl that had arisen as the department’s storage estate grew.


The department’s data largely comes from oceanography, atmosphere and climate modelling run on high-performance computing (HPC) systems outside the university, and is sent to Reading for further analysis.

The department started out as a Natural Environment Research Council (NERC) research centre at the university and all data was stored on separate NAS servers, with a variety of operating systems (OSs) including the Red Hat Linux clone, CentOS, and SUSE Linux Enterprise Server, with either ext3/4 or XFS file systems beneath them.  

By 2010, there was about 50TB held on the department’s NAS boxes and the situation had become very difficult to manage. 

“It was all in silos and a complete mix of vendors depending on the best deal we found to hold the data at the time,” said Dan Bretherton, research HPC manager in the department of meteorology at University of Reading.

At that time, the department started to have big problems with its NAS servers. There were some very large volumes of data, such as output from the NEMO ocean circulation model, that would fill up entire storage servers. Often, to accommodate such data sets, links were set up between NAS boxes to make them look like one location.

The result, said Bretherton, was that all six NAS servers became interdependent in a really uncontrolled way. "Downtime scheduled for one server would make data unavailable from another. Performance was very unpredictable too. A processing run that took one hour on one day would take four hours or all day on another,” he said.

“We had to solve the problem,” said Bretherton. “We couldn’t afford a SAN, so had to look at a software solution that could balance across all the NAS boxes. GlusterFS [acquired by Red Hat in 2011 and rebranded Red Hat Storage Server] had all the features that we needed.”

The department deployed GlusterFS/Red Hat Storage Server on commodity x86-based NAS hardware totalling around 300TB in capacity. It is a scale-out NAS operating system that is part of Red Hat Enterprise Linux.

Scale-out or clustered NAS is a great improvement on traditional NAS, which is limited in terms of the numbers of files supported by the OS and file system, and also physically by the capacity of the NAS box. Scale-out NAS operates in a distributed fashion across many devices and can scale to very large numbers – often billions – of files.

Scale-out NAS allows users to build grids of NAS hardware instances with a global namespace so that the entire file system looks the same across all devices.
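Systems such as GlusterFS achieve this single namespace without a central metadata server by hashing file names to decide which storage node ("brick") holds each file, so every client computes the same placement independently. A toy Python sketch of the idea, with hypothetical brick names (the real elastic hashing algorithm is considerably more involved):

```python
import hashlib

# Hypothetical bricks making up one global namespace.
BRICKS = ["nas1:/brick", "nas2:/brick", "nas3:/brick"]

def brick_for(path: str) -> str:
    """Pick a brick by hashing the file path.

    Deterministic: every client that knows the brick list agrees on
    where a given file lives, with no metadata server to consult.
    """
    digest = hashlib.md5(path.encode()).hexdigest()
    return BRICKS[int(digest, 16) % len(BRICKS)]

# Any node computes the same placement for the same path.
print(brick_for("/models/nemo/run42.nc"))
```

Because placement follows from the path alone, adding capacity means adding bricks and rebalancing, rather than scaling up a single filer.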

Red Hat Storage Server is overlaid onto a file system that resides on each NAS instance. XFS is the preferred file system and currently Bretherton is moving his department’s data to that format from ext3/4. 

A benefit for Bretherton is that if anything goes wrong with the Red Hat cluster, data is still accessible from the native file system on the NAS device.

The department initially deployed the community-supported GlusterFS, but according to Bretherton this was “taking a lot of time, with community support that was a bit hit and miss”.

The benefits of Red Hat Storage Server, said Bretherton, are: “Compared to the community version we spent a lot less time maintaining it. The benefits compared to the pre-Gluster days are that it is much more predictable, with uniform performance. We can take servers out for maintenance without users knowing about it.

“I used to spend 40% of my time firefighting. Now that’s more like 15%, and I get time to spend on other things while Red Hat ticks over,” he added.

Bretherton also considered using IBM’s General Parallel File System (GPFS) as it is in use at two of the large NERC centres. 

“But, even with support from IBM the NERC centres have had to put in a lot of effort to make GPFS work effectively, and the potential for getting it horribly wrong is very real,” said Bretherton. “[The open source storage platform] Ceph would probably fall into that category as well,” he added.

Other distributed storage options – GoogleFS (GFS) and OCFS (Oracle Cluster File System) – were rejected by Bretherton because they lacked the load balancing and high-availability (HA) features that GlusterFS had. 

“Although it is possible to combine them with replication solutions like [Linux replication method] DRBD (Distributed Replicated Block Device) to provide HA that doesn't sound easy. HA works straight out of the box with GlusterFS and Red Hat storage,” he said.



Tuesday, November 26, 2013

Racing Post warns users of website breach

The Racing Post is advising users of its website to change their passwords for other sites if they use the same one in case hackers break the encryption.

The company has promised to adopt "stringent" new measures to prevent a repeat of the weekend security breach on its website racingpost.com.


The Racing Post said its website was hit by a "sophisticated, sustained and aggressive" attack that compromised a database containing customer details including usernames and encrypted passwords.

The company said the risk will vary according to how much information users gave when they registered, but that no credit or debit card details are at risk.

“Betting through the site with our partner bookmakers has at all times been unaffected as this activity takes place directly with the bookmaker,” the company said in a statement on its website.

The Racing Post said it has turned off the ability to register or log in to racingpost.com, making the site safe to use.

Racing Post editor Bruce Millington said the attack may be part of a wider attack on a number of companies.

Lloyd Brough, cyber incident response director at information assurance firm NCC Group, said the attack appears to have exploited a common web application vulnerability to compromise the database.
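The article does not name the flaw, but the textbook example of a web application vulnerability that yields a whole database is SQL injection, where user input is concatenated into a query instead of being passed as a parameter. A minimal illustration in Python with sqlite3 (the table and data are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw_hash TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'x')")

user_input = "' OR '1'='1"  # attacker-controlled value

# Vulnerable: string concatenation lets the input rewrite the query,
# so the WHERE clause becomes always-true and every row comes back.
unsafe = "SELECT * FROM users WHERE name = '%s'" % user_input
print(len(conn.execute(unsafe).fetchall()))

# Safe: a parameterised query treats the input as data, not SQL.
safe = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,))
print(len(safe.fetchall()))  # prints 0: no user has that literal name
```

Parameterised queries (or an ORM that uses them) close this class of hole regardless of what the attacker types.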

“While it is positive Racing Post has been quick to disclose the breach, providing further technical details on what type of 'encryption' was used for the passwords would have helped further inform technical users,” he said.

According to Brough, organisations often claim encryption, where in fact they are using hashing via algorithms such as MD5 without salts or iteration counts.

“If this is the case then it is little better than using unencrypted passwords, due to the trivial nature of recovering them,” he said.
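The difference Brough describes can be shown in a few lines of Python: an unsalted MD5 digest is identical for every user with the same password, so precomputed tables recover it quickly, whereas a salted, iterated scheme such as PBKDF2 gives each user a unique, expensive-to-crack hash. A sketch with illustrative values:

```python
import hashlib
import os

password = b"correct horse"

# Unsalted, single-round MD5: the same password always produces the
# same 32-character digest, which lookup tables can reverse cheaply.
weak = hashlib.md5(password).hexdigest()

# Salted, iterated hashing: a per-user random salt plus a high
# iteration count makes bulk cracking far more expensive.
salt = os.urandom(16)
strong = hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)

print(weak)
print(strong.hex())
```

Two users with the same password get different PBKDF2 hashes (different salts), while their MD5 digests would be identical.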



More than half of mobile applications in Indian organizations are internal

Over half (59%) of mobility applications being developed by Indian businesses are for employees, but CIOs must now invest in applications aimed at customers, according to Forrester Research.

Only 24% of mobility applications are for customers, with the remainder of apps developed for business partners.

Forrester studied 59 live mobile application projects from 41 organizations in India with an employee base exceeding 500. These businesses were spread across verticals such as manufacturing, financial services, automotive, media, healthcare, professional services, telecommunications, and utilities.

The research looked at how Indian organizations are leveraging mobile business applications to better connect with customers, partners, and employees.

"Indian organizations embrace mobile applications for employee enablement, but must also target external customers," said Katyayan Gupta, analyst at Forrester Research.

“Our research indicates that mobile applications will be a more critical channel for reaching consumer markets in Asia Pacific in the future compared with more developed Western markets. This is especially true in India, where the mobile Internet user base is growing at the rate of more than 30% annually, primarily due to the disproportionately young population reducing the average selling price of smartphones to below $200 and reducing the cost of data plans.”

Calculating ROI is easier for employee-centric applications than for customer- or business partner-centric ones, and as a result most organizations in India are giving top priority to developing applications for their employees first, said Forrester. The most commonly developed mobile applications by Indian organizations involve sales and field worker automation.

However, Gupta suggested that CIOs should broaden their mobility focus beyond employee enablement to externally facing systems of engagement that empower individuals to take the next most likely action, which will help drive explosive growth in engagement.

According to Forrester, mobile applications will be a more critical channel to reach customers in the future in India where the mobile Internet user base is growing at the rate of more than 30% annually.

A report earlier this year from CIO Klub revealed that mobile investments are at the top of the agenda for CIOs. Shirish Gariba, president of the CIO Klub, said: "The CEO cockpit view of the business is expected to drive mobility evolution. 62% of organizations are likely to implement, upgrade or evaluate mobile applications in the coming year."


Monday, November 25, 2013

BlackBerry continues executive shakeup

Another two members of BlackBerry’s management team are being pushed out of the struggling mobile firm, it was confirmed today.

The company’s chief operating officer, Kristian Tear, and its chief marketing officer, Frank Boulben, are leaving and replacements have yet to be announced.


It was also confirmed that Roger Martin, a member of BlackBerry’s board since 2007, has resigned, although no explanation for his decision was given.

This is the latest in a raft of changes at the top following the exit of CEO Thorsten Heins at the beginning of the month. He was ousted after the firm failed to find a buyer and turned down a $4.7bn takeover offer from its largest investor, Fairfax Financial Holdings, choosing instead to accept a $1bn investment and try to go it alone.

Chief financial officer, Brian Bidulka, was also given his marching orders and today his successor was announced as James Yersh, who has been senior vice president, controller and head of compliance at BlackBerry since 2008.

BlackBerry’s CEO, former Sybase boss John Chen, said the measures were needed to change the company’s fortunes.

“BlackBerry has a strong cash position and continues, by a significant margin, to be the top provider of trusted and secure mobile device management solutions to enterprise customers around the world,” he said.

“Building on this core strength, and in conjunction with these management changes, I will continue to align my senior management team and organisational structure and refine the company’s strategy to ensure we deliver the best devices, mobile security and device management through BES 10, provide multi-platform messaging solutions with BBM, and expand adoption of QNX embedded systems.”

BlackBerry is due to announce its third quarter results on 20 December.

