Those pesky hackers

In the last 10 years, cryptography researchers have demonstrated that even the most secure-seeming computer is shockingly vulnerable to attack. The time it takes a computer to store data in memory, fluctuations in its power consumption and even the noises it emits can betray information to a savvy assailant.

Attacks that use such indirect sources of information are called side-channel attacks, and the increasing popularity of cloud computing makes them an even greater threat. An attacker would have to be pretty motivated to install a device in your wall to measure your computer’s power consumption. But it’s comparatively easy to load a bit of code on a server in the cloud and eavesdrop on the other applications running on it.

Fortunately, even as they’ve been researching side-channel attacks, cryptographers have also been investigating ways of stopping them. Shafi Goldwasser, the RSA Professor of Electrical Engineering and Computer Science at MIT, and her former student Guy Rothblum, who’s now a researcher at Microsoft Research, recently posted a long report on the website of the Electronic Colloquium on Computational Complexity, describing a general approach to mitigating side-channel attacks.

This month, at the Association for Computing Machinery’s Symposium on Theory of Computing (STOC), Goldwasser and colleagues are presenting a paper demonstrating how the technique she developed with Rothblum can be adapted to protect information processed on web servers.

As well as preventing attacks on private information, Goldwasser says, the technique could also protect proprietary software so that it can’t be reverse-engineered by pirates or market competitors — an application that she, Rothblum and others described at last year’s AsiaCrypt conference.

Today, when a personal computer is in use, it’s usually running multiple programs — say, a word processor, a browser, a PDF viewer, maybe an email program or a spreadsheet program. All the programs are storing data in memory, but the computer’s operating system won’t let any program look at the data stored by any other.

The operating systems running on servers in the cloud are no different, but a malicious program could launch a side-channel attack simply by sending its own data to memory over and over again. From how long that storage and retrieval takes, it can infer what the other programs are doing with remarkable accuracy.
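To see why timing alone can leak information, here is a minimal, hypothetical Python sketch of the probing idea: a program times its own memory accesses and counts unusually slow samples, which on shared hardware can indicate contention caused by a co-resident program. The function names, buffer sizes and spike threshold are all invented for illustration; a real attack measures cache behaviour at far finer granularity.

```python
# Illustrative toy, not a real attack: repeatedly time our own memory
# accesses and count unusually slow samples. On shared hardware, such
# spikes can indicate cache/memory contention caused by a co-resident
# program. The threshold and sizes below are invented for illustration.
import time

def probe_memory_timings(samples: int = 1000, size: int = 4096) -> list:
    """Time repeated writes to a small buffer; return latencies in seconds."""
    buf = bytearray(size)
    timings = []
    for i in range(samples):
        start = time.perf_counter()
        buf[i % size] = i & 0xFF          # the memory access being timed
        timings.append(time.perf_counter() - start)
    return timings

def suspicious_spikes(timings: list, factor: float = 5.0) -> int:
    """Count samples far above the median: a crude contention signal."""
    median = sorted(timings)[len(timings) // 2]
    return sum(1 for t in timings if t > factor * median)

timings = probe_memory_timings()
spikes = suspicious_spikes(timings)
```

On a quiet machine the spike count will be low; run a memory-hungry process alongside and it tends to rise, which is exactly the kind of signal a side-channel attacker correlates with the victim’s activity.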

Goldwasser and Rothblum’s technique obscures the computational details of a program, whether it’s running on a laptop or a server. Their system converts a given computation into a sequence of smaller computational modules. Data fed into the first module is encrypted, and at no point during the module’s execution is it decrypted.

The still-encrypted output of the first module is fed into the second module, which encrypts it in yet another way, and so on.

The encryption schemes and the modules are devised so that the output of the final module is exactly the output of the original computation. But the operations performed by the individual modules are entirely different.
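This is not Goldwasser and Rothblum’s actual construction, but the flavour of the idea can be sketched with a classic masking trick: split the value into two random shares, let each “module” work on the shares and re-randomize them before handing off, and only recombine at the very end. The internals of any one module are then statistically independent of the real data, yet the final output matches the plain computation exactly. Everything below is a toy for illustration.

```python
# Toy illustration of the principle only (not Goldwasser-Rothblum's
# construction): the value flowing between "modules" is split into two
# random shares, so the internals of any single module are statistically
# independent of the real data, yet the recombined output is exact.
import secrets

MOD = 2**32

def share(value: int) -> tuple:
    """Split value into two shares whose sum mod 2^32 is value."""
    r = secrets.randbelow(MOD)
    return (r, (value - r) % MOD)

def module_add(shares: tuple, constant: int) -> tuple:
    """One pipeline module: add a constant, then re-randomize both shares."""
    a, b = shares
    a = (a + constant) % MOD       # do the module's work on one share
    r = secrets.randbelow(MOD)     # fresh randomness for the hand-off
    return ((a + r) % MOD, (b - r) % MOD)

def recombine(shares: tuple) -> int:
    a, b = shares
    return (a + b) % MOD

x = 1234
s = share(x)
for c in (5, 7, 11):               # three modules, each adding a constant
    s = module_add(s, c)
assert recombine(s) == (x + 5 + 7 + 11) % MOD
```

An observer who measures any single module sees only uniformly random shares; only the recombination step, which never happens inside a module, reveals the true value.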

A side-channel attacker could extract information about how the data in any given module is encrypted, but that won’t let them deduce what the sequence of modules does as a whole. The adversary can take measurements of each module, Goldwasser says, but can’t learn anything more than they could from a black box.

The report by Goldwasser and Rothblum describes a type of compiler, a program that takes code written in a form intelligible to humans and converts it into the low-level instructions intelligible to a computer.

In the compiler, the computational modules are an abstraction: the instruction that inaugurates a new module looks no different from the one that concluded the last. But in the STOC paper, the modules are executed on different servers on a network.

(With thanks to Larry Hardesty at MIT)

Samsung’s spanking new phone

The South Korean manufacturer has introduced the third generation of its flagship smartphone brand, the Galaxy S III. The phone will be branded as the official 2012 Olympics phone. It belongs to the super-phone category of smartphones which also includes the HTC One X, the forthcoming iPhone 5 and LG Optimus 2X.

According to Informa Telecoms and Media, the super-phone market will generate sales of more than 50 million units by the end of 2012. The Galaxy S III will certainly enable Samsung to reinforce its position as the leading vendor in this market. It will also enable the consumer electronics giant to maintain its leadership as the dominant Android manufacturer, with an estimated one-third market share by the end of 2012. Says Informa Telecoms and Media’s Principal Analyst, Malik Saadi:

“What is unique about the Galaxy S III is the level of intelligence Samsung has created around its embedded features and sensors which takes smartphone innovation into another league.

“The device’s features are capable of communicating with each other and sharing information, enabling it to react intuitively and automatically to an action taken by the user. For example, the phone can recognise a face in a picture taken with the camera and will associate it with a contact saved in the address book.

“The phone will then automatically save the picture in a relevant file (family, friends, colleagues), tag it, and suggest you should upload it to Facebook or Twitter.”


While the Galaxy S III will be highly desirable for enthusiastic and advanced users, Samsung will have to build on the already popular Galaxy brand and push it hard through various distribution channels before the iPhone 5 is launched.

However, Samsung will find it hard to convince and educate the typical mobile phone user to adopt all the advanced experiences this phone enables. In that segment, the Galaxy S III is unlikely to meet with great success, at least in the early stages after launch.

Continues Saadi:

“The casing of the Galaxy S III is another weakness, as it’s based on the usual plastic casing found in most of Samsung’s phones and doesn’t do justice to the device’s impressive features. Samsung needs to learn from the likes of Nokia and Apple which use high-quality materials and the best designs for casing their premium devices.

“Galaxy S III could, potentially, also cannibalise sales of some of Samsung’s other popular smartphones, including Galaxy Note, Galaxy S II and Galaxy Nexus. Therefore, Samsung will have to come up with a well-structured price segmentation, where Galaxy S III addresses the premium price points while the existing Galaxy devices enter the lower price points to widen the audience of the overall Galaxy brand.”

Cosworth: More than just engines

Last month’s exciting finale to the 2010 FIA Formula One World Championship marked the end of Cosworth’s first year back in the sport as an engine supplier following a three-year absence. It saw the Northamptonshire engineering firm maintain its record for 100% race reliability with the Cosworth CA2010 engine.

With 1,129 laps and 5,795km of racing over 19 races, the Cosworth CA2010 engines supplied to AT&T Williams, Virgin Racing, HRT and Lotus Racing amassed over 100,000km of race weekend running. The competitiveness of the engine was also noted, helping Williams to make steady progress with the Williams Cosworth FW32 package during the season to finish the year in 6th place in the Constructors’ World Championship.

On top of Cosworth’s return, the company’s electronics division continued to develop its long-standing business activity in Formula One, supplying complete solutions to both HRT and Lotus, and steering wheels to all three of the new teams. Allied to Cosworth’s provision of wind tunnel control systems, this gave significant presence to the Cosworth Group throughout the paddock.

So Mark (Gallagher – General Manager of Cosworth’s F1 Business Unit), what’s your assessment of the 2010 season?

“We achieved all our key operational objectives from an engine supply perspective, providing our four customers – one third of the grid – with a competitive, reliable and affordable Cosworth CA2010 engine supported trackside by a dedicated team of technicians embedded within the teams, and backed up in Northampton by the personnel in engineering, manufacturing, build, test and operations.

“From a business perspective the season went well; we spent money where we needed to and we achieved the profitability required to continue investing in the programme. The relationships with our customers were good, and have developed well – so we can look back on the season with a good degree of satisfaction.”

Did the engine perform as expected?

“The Cosworth CA2010 was created in a very few months using the original CA2006 as the baseline, but revising it to produce peak power within a rev limit of 18,000rpm, a much extended duty cycle of up to three full race weekends, and fuel consumption consistent with the ban on refuelling and the increased emphasis on start-weights. I believe our engineers did an outstanding job.

“The engine performed very well in pre-season dyno-testing, but we knew that once it hit the track we would need to optimise its performance more fully. The fact that pre-season testing was rain-affected, and that only one of our four customers took part in all the tests, rather limited the gathering of useful data.

“Once we started racing we had a couple of issues which, while not ‘show stoppers’, necessitated some action to revise the oil system and tackle slightly higher than expected power degradation. I am pleased to say the issues were quickly identified and tackled. Obviously we had an initial pool of engines already with the teams and it took a little time to cycle rebuilds through the system to revise specifications, but at no stage was the programme compromised and the best measure of that was total race reliability.”

What about actual results on track and pure performance?

“If you work in any significant aspect of Formula One whether as a team or a key technical supplier and don’t focus on winning, then there is no point being here. As a supplier of engine and electronics technologies Cosworth plays an important part in contributing to the overall package of the teams we work with but, ultimately, the chassis, the vehicle dynamics, the aerodynamics and the myriad of other systems which go to defining a Formula One car are not within our control. We therefore focus on making sure our technology behaves absolutely to the best of its abilities.

“The results on track in relation to the new teams were very much in line with our, and their, expectations. It was always going to be a three-way battle behind the vastly experienced teams, many of which have enjoyed manufacturer support over the last decade and therefore have extensive technical facilities and resources as well as deeply experienced personnel.

“Williams gave Cosworth a much better opportunity to show our true performance and together we achieved every possible points finish in 4th – 10th places and scored that pole position in Brazil which, whilst due to conditions, was a memorable milestone.”

Speaking of Williams, how does Cosworth view 2010?

“AT&T Williams is one of the very best teams in Formula One, with enormous capability and experience. We have worked hard to ensure that the Cosworth engine contributed successfully to their overall package and at all times they have demanded from us a constant push to optimise performance. We have no problem with that; it’s a very good thing because when you add our inherent motivation to the determination of a team such as Williams, success will come.

“The package started the season with the team demanding improvements in every area, but from Valencia onwards the results started to improve with both cars making it through to Q3 on a regular basis and both Rubens (Barrichello) and Nico (Hülkenberg) scoring points. Ultimately the package finished 6th in the Constructors’ World Championship and there were an increasing number of occasions when we could outpace Mercedes in qualifying and mix it with both them and Renault in the races. There is much reason to believe we have now achieved a good platform on which to build.”

Lotus Racing won the ‘battle of the new teams’; what’s your view of their achievement?

“Tony Fernandes and Mike Gascoyne set out with a number of goals for 2010 and appear to have achieved them all, particularly in terms of being the most successful of the new teams and achieving a degree of credibility which some of their critics did not expect.

“Considering that they only received their official entry in September 2009, it was an impressive effort and I am pleased to say Cosworth engines and electronics played a key part in helping them make the grid and deliver a consistent performance.

“Unfortunately a number of issues involving their transmission system set the team on a different course in terms of seeking a new engine-transmission pairing for 2011 – but none of the reasons for their decision to switch to an alternate engine had anything to do with the performance of the CA2010. We wish them all the best for the future.”

How do you feel HRT performed in 2010?

“One of the benefits of supplying engines and electronics to teams is the extent to which you get to know them, and although HRT have come in for a lot of criticism in relation to on-track performance, I think the team pulled together incredibly well and did a very solid job all year.

“Their reliability was actually very impressive and, when one considers that it was only mid-February when Dr Kolles took over as Team Principal, in many respects their accomplishment in building the cars and competing in all 19 events, against a backdrop of easy-to-make criticism, deserves reward.

“Bruno Senna, Karun Chandhok, Sakon Yamamoto and Christian Klien all did their best, trying not to create traffic problems for the truly competitive cars, and yet adding to the show for race-going spectators and the audiences in their home countries. The team deserves to progress.”

Can you comment on Virgin Racing’s year and also the recent deal with Marussia Motors?

“Virgin Racing made enormous strides throughout 2010, coping with some severe reliability problems early on to achieve improved performances and ultimately real credibility as a team. Under John Booth’s direction the team never made any secret of the fact that this was going to be a learning year, and Nick Wirth’s CFD-designed VR01 acquitted itself well against Lotus and gave the team a lot to build on for next season.

“The investment by Marussia Motors is good news for the team, and also for Cosworth, as we have worked with Marussia for over a year and are currently delivering powertrains to Moscow for production of the very attractive Marussia B1 sportscar. Having two customers come together in one team gives us much to look forward to, and Marussia Virgin Racing will no doubt add new interest to the sport in Russia in addition to that already created by Vitaly Petrov and the forthcoming Russian Grand Prix in 2014.”

The first race of the 2011 season is in exactly 100 days’ time – how are preparations going?

“This is a very busy time of year and work on 2011 started months ago with the development of the KERS drive which is currently being tested. We are working closely with AT&T Williams, Marussia Virgin Racing and HRT to support their pre-season testing, car launches and start-of-season activities, and we expect to be running both the standard CA2010 and KERS version of the engine next season.

“We are also restructuring some of our internal systems to improve processes wherever possible, so the coming weeks will be typically hectic. We are very much looking forward to 2011.”

Finally, from a personal perspective, how has your first year at the helm of Cosworth’s F1 business gone?

“When I accepted the role here I was under no illusion that it would be a demanding job, but ultimately very rewarding. It has met both those expectations to a much greater extent than I imagined!

“I learned from my time running a championship winning team in A1GP that it is vitally important to let engineers and technicians do what they do best, empower them to get on with the job, and focus on making sure the contracts are fulfilled and the business operates profitably. I have learned a great deal too, which means that the role is always interesting; but most of all I have learned about the wider capabilities of the Cosworth Group.

“It is to my frustration that the F1 audience still views Cosworth as an ‘engine’ company when in fact we are a great deal more than that today. The electronics, aerospace and defence and automotive work that goes on here is astonishing, yet unfortunately a well-kept secret. Part of what we will be doing in the future is communicating more effectively the highly diversified engineering and manufacturing business that Cosworth represents.”

Start looking at data in new ways


In the last 30 years there has been a revolution in the sophistication – and to some extent the falling cost – of management information systems. But are managers any better informed today?

Senior managers still find that most of the information they get is not timely enough, not relevant and often difficult to interpret; their information needs are not being adequately met.

Something isn’t quite working. So – what’s the problem?

Is the information bar rising faster than technology can meet it? Managers need more and more information, ever more quickly; competition is more severe and the whole pace of business has got faster. We’re on an escalator that is constantly moving.

Or does the ineffectiveness of the technology have to do with the managers themselves? Perhaps they still lack the skills or the mindset to make use of the tools that are there.

It was therefore with some interest that I came across Information Age’s suggestion that perhaps the answer lies in the way the human visual system processes information:

“…the full potential of the visual system not only to understand but also to analyse information is still underused by business information systems.”

Founder and Principal of Perceptual Edge – an independent analyst and consultant whose work focuses on data visualisation – Stephen Few argues that the notoriously high rate of failure among business intelligence (BI) projects can be attributed, in part, to the failure of software vendors to design their tools to reflect the way human beings perceive information.

In its piece – The Analytical Eye – Information Age goes on to say:

“Of course, top of the range BI software allows users to create all manner of charts and graphs with more effects and visual adornments than one could ever need. But this is not the same as designing a system that lets the user exploit the potential of their visual system to perform analyses they would be unable to do mathematically.”

BI technologies will of course continue to evolve as business requirements change, helping to capture and share knowledge about customers, markets, and other key areas of the business.

However, if they are to be effective, such tools will have to become more capable of mimicking the human capacity for thinking multi-dimensionally, for comprehending spatial, geographical and visual sources of information, and for making inferences.

Some words of wisdom for the UK’s IT industry

Says Director of Secure Thinking and Secure Resources, Lee Hezzlewood:

  • Educate staff to design and deliver solutions that are secure from the outset.
  • Stop Big Bang projects, which almost always fail, overrun or go over budget (or all three), particularly in the public sector. Deliver projects incrementally with realistic goals and objectives.
  • Integrate more with the business – this needs to be approached from both sides. IT is fundamental to most businesses these days but there is still an us-and-them attitude.
  • Stop inventing jargon designed to baffle and confuse non-IT people. Speak plainly and simply!

Kitchen of the future – imagine…

According to the UN, 74 per cent of the world’s population will live in cities by 2050. This high level of urbanisation means that we have to rethink the way we use space, energy and our environment.

Those awfully clever people at Electrolux have used this scenario as a foundation and developed a brand new design concept to illustrate how we’ll live in the future. Heart of the Home is an integrated solution functioning as a kitchen table, cooking surface and bar all in one. Or as Electrolux says: “An intelligent, amorphous, interchangeable cooking surface that adapts to user needs”.

When using the Heart of the Home one simply places one’s ingredients on the surface. The appliance then analyses the ingredients and presents a list of suitable recipes. After deciding on a recipe, the user marks an area with his hand to determine how large the cooking area should be. Then the desired depth of the surface is created by simply pressing the hand against the malleable material. After achieving the required width and depth it’s just a matter of setting temperature and time with a simple touch of a finger.
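Purely for fun, the interaction flow described above might be sketched like this; every recipe, ingredient and setting below is invented for illustration, and has nothing to do with how Electrolux might actually build it:

```python
# A playful sketch of the Heart of the Home interaction flow: ingredients
# on the surface -> suggested recipes -> user gestures set area, depth,
# temperature and time. All names and values are invented.
RECIPES = {
    "omelette": {"eggs", "butter"},
    "pancakes": {"eggs", "flour", "milk"},
}

def suggest_recipes(ingredients: set) -> list:
    """Return recipes whose required ingredients are all on the surface."""
    return sorted(name for name, needs in RECIPES.items()
                  if needs <= ingredients)

def configure_surface(width_cm: float, depth_cm: float,
                      temp_c: int, minutes: int) -> dict:
    """Record the cooking area and settings chosen by the user's gestures."""
    return {"width_cm": width_cm, "depth_cm": depth_cm,
            "temp_c": temp_c, "minutes": minutes}

suggestions = suggest_recipes({"eggs", "butter", "flour", "milk"})
settings = configure_surface(30.0, 5.0, 180, 12)
```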

Fantastic idea! Imagine never having to use pans or pots, worry about whether the ingredients are fresh or look up a recipe in a cookbook. All we need now is the built-in chef.

iPad…worth the wait, or lost the plot?

Apple's new iPad

Steve Jobs’ – or should I say Jony’s, Apple designer Jonathan Ive’s – long-awaited tablet device (April/May 2010 delivery – tune into the launch event in QuickTime and MPEG-4 here) has been much hyped as something we’ll use to read our books and papers on.

This may happen – The Economist takes a positive stance; it may not, especially if Fujitsu has anything to do with it. The Japanese company holds an iPad trademark. The jury’s out on whether the tablet’s screen is up to the job (no pun intended). And for some, the iPad’s price is lower than expected. This may discourage other suppliers from launching their own and scupper the market before it’s had a chance to take off.

The technical cognoscenti will no doubt be whining about the lack of a camera, support for Flash and Java, adequate battery life, etc, etc. Is it the new ‘I want’, or the new ‘I can do without’? If you’re a woman, you’ll think the former – watch this very funny YouTube take from the US. And another, slightly longer but in my view even funnier, take using Adolf H.

If you really want to understand where Apple’s going with iPad, ask Apple Meister Alan Kay. The real point is: Apple’s iPad is more than an iPhone with go-faster stripes on the side.

It represents refinement – of features, of use. It isn’t a revolutionary product, but a supremely evolutionary one. The epitome of product design and the nearest thing we have to a ’rounded’ computer that has to pay its way alongside a best-selling phone and music/film player.

Or, if you’re an Apple fan and UK telly treasure, Mr Stephen Fry (bless ‘im):

“It’s transcendentally smooth and fast. It’s astounding. God, it’s beautiful. The display is stunning. I’m drooling with anticipation.”

New wireless technology is three years away

In the middle of nowhere and can’t use your mobile phone?

Well, finally, the independent regulator and competition authority for the UK communications industries – Ofcom – has published a discussion document to explore the potential of a new technology that could wirelessly link up different devices and offer enhanced broadband access in rural areas.

The technology works by searching for unoccupied radio waves – called white spaces – between television channels to transmit and receive wireless signals. Compared with other forms of wireless technology, such as Bluetooth and Wi-Fi, white space devices are being designed to use lower frequencies that have traditionally been reserved for TV.

Signals at these frequencies travel further and more easily through walls. This will potentially allow a new wave of technological innovation in wireless communications. Although at least three years away from commercial production, possible applications include:

  • Improved mobile broadband access in rural areas
  • Digital cameras that can automatically transmit photos back to your computer as soon as you click the shutter
  • The ability to control appliances in your home – such as the oven and central heating – from hundreds of miles away.

However, white space devices must first prove they can operate without interfering with TV broadcasts and other wireless technologies that share these frequencies, such as wireless microphones. A promising solution is for devices to consult a geolocation database that contains live information about which frequencies are free to use at their current location.
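The geolocation-database idea is easy to sketch: the device reports its approximate position and the database answers with the channels free at that spot. Everything below – the coordinates, channel numbers and one-decimal rounding – is invented for illustration; a real database would be far larger and updated live by the regulator.

```python
# Illustrative sketch of the white-space geolocation lookup: a device
# reports its position and receives the TV channels free at that spot.
# The entries and the coarse one-decimal rounding are invented.
FREE_CHANNELS = {
    (52.2, 0.1): [21, 24, 27],    # hypothetical entry for one locality
    (51.5, -0.1): [22, 25],       # hypothetical entry for another
}

def free_channels(lat: float, lon: float) -> list:
    """Look up usable white-space channels for a (rounded) position."""
    key = (round(lat, 1), round(lon, 1))
    return FREE_CHANNELS.get(key, [])   # no entry: transmit nothing

channels = free_channels(52.21, 0.09)
```

The crucial design choice is the fail-safe default: a device with no database entry for its location transmits on nothing, so it cannot interfere with TV broadcasts or wireless microphones it doesn’t know about.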

Ofcom’s discussion document focuses on the issues that need to be addressed for this solution to work. If there is strong evidence to show that white space devices can coexist with neighbouring TV signals and wireless microphones without causing interference, then Ofcom would allow them to use the frequencies without the need for individual licences. However, this technology remains largely unproven and a significant amount of work needs to be done before these claims can be tested.

The purpose of Ofcom’s discussion document is to further the thinking that is taking place around the world on geolocation and speed the development of possible solutions.