Pulling all the bits together

Academic computing has come a long way from the first lumbering leviathans. Now, says John Gilbey, we need to work on making everything accessible everywhere, every time

January 29, 2009

The use of information technology in the higher education community is now so deeply ingrained in our daily lives that we can sometimes forget it hasn't always been there.

Look at your daily schedule and ponder how much of it you could achieve if, heaven forbid, you were suddenly robbed of your network resources. I tried this, and it looked pretty scary. If IT disappeared I wouldn't be able to roll out of bed and exchange views and documents with a bunch of other academics while I was still in my pyjamas - not in this town, anyway.

I've even told random passers-by when I've managed to do something that, to me, smacks of deep magic - such as when I picked up a set of documents from my university email account while standing in a Californian shopping mall, with my tiny wireless laptop balanced on a litter bin. They expressed polite, if amused, interest and hurried away.

Access to good-quality IT is now so ubiquitous in the academic community that, as with good health, we don't realise we have it until something goes wrong. Even in the family home, practically every student is used to having access to a 100-megabit local area network connected to the internet via a router that would have been considered the height of corporate sophistication only a few years ago.

It is hardly surprising that each annual tide of students arrives with a higher level of expectation - which, as paying customers, they are keen to see met in full measure.

The supply of a robust internet service is firmly established as a utility in the minds of the academic community, and its delivery as part of daily life is no more surprising than the lights working or the toilet flushing. It isn't, in itself, enough to impress any more - you need to add extra value.

The clever part of service delivery in this environment lies in being able to second-guess what the next "Big Thing" in networked services is going to be in time to prepare the budget, plan the implementation and deliver the project before the potential customers have moved elsewhere to get the service.

Think about the rise of social networking, for example. Walk around any open-access workstation room in a UK university and you are likely to see at least half the students connected to one or more sites dealing with the minutiae of lifestyle and interpersonal communication - mostly involving pictures taken in the pub the night before. With our core demographic so bound up with - I'm tempted to say addicted to - this environment, it was inevitable that universities would need to allow access to these sites, despite the radical shift in security policy and access philosophy that was often demanded.

The payback is that we are now seeing effective web-based academic services that use similar tools and interfaces.

Then there are the intrinsically technology-driven developments such as wireless networking, which has allowed mobile working to become deeply entrenched and hugely popular in a comparatively short time frame. Similarly, the concept of needing to hold corporate data within your own immediate environment is weakening.

Services hosted remotely by commercial companies are becoming popular in academic environments: they can be a quick and effective alternative to developing specialised systems in-house. In consequence, the old picture of a university network as a mighty citadel defended by the portcullis of a single firewall has ceased to reflect reality. Information security now needs a high degree of granularity and a clear understanding of specific requirements.

All these services demand access to significant resources, both human and financial, if they are going to be delivered in a way that satisfies the needs of both the customer and the organisation. For example, there will inevitably be a lead time with any new technology while staff get up to speed with the products and devise ways of matching them to the corporate computing environment - unless, that is, you can go outside for specialist help.

IT service delivery has, in some ways, simplified radically over the past decade or so. Thanks in some measure to the pragmatic requirements of internet access, there has been a high degree of global standardisation on the connectivity of systems - giving us networks where PCs, Macs and Unix systems share network protocols, access to services and file stores in a way that would have made IT managers of 20 years ago think they had died and gone to heaven.

But the use and promotion of standards is a dangerous concept to try to sell in academia. It smacks of control, restriction and attacks on academic freedom. In IT, standards mean exactly the opposite - they provide universal connectivity, they liberate you from proprietary lock-in and they provide you with a huge pool of potential collaborators who are just a keystroke away. In short, the more you simplify and standardise your IT provision, the better placed you are to take advantage of new developments as they arise.

The one thing that we can be sure about, apart from the assumption that we are going to be desperately short of money over the coming years, is that all the IT solutions we build today are destined to be temporary. We know that they will be replaced, made obsolete or otherwise discarded. Once we recognise the implications of this, the way forward should become much clearer.

As part of this approach, I believe we need to take a step back and look at the bigger picture of academic computing services. Why? Simply because the existing core model is in essence the same as when the UK academic network was developed, when I was still typing my data on to paper tape.

Universities run quasi-autonomous local networks to provide connectivity and services. These are connected to a national network infrastructure that offers access to specialised facilities and wider resources.

This is a well-established model and it has scaled well up until now, as the various generations of services have proven. My question is whether this will remain the best model as we go forward. If we were building the environment today from scratch, is this the model we would choose? Perhaps, but perhaps not: in a world where so much of IT provision has become standard, I'd suggest that a much greater degree of uniformity could now be developed across the UK academic community in terms of standard service provision.

Think, for example, of the staff hours that could be saved if every academic desktop were built to conform to a co-ordinated set of requirements, rather than each organisation, and in some cases each department or individual, deciding to build their own.

With a truly integrated academic infrastructure, students could arrive at university, register and immediately have access to all the software, tools and information resources that have been defined for their personal mix of units. Their desktop environment, files and resources would follow them around the world via both wired and mobile systems, finally delivering the dream of "everything, everywhere, every time" in full - like a virtual learning environment on steroids.
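To make the idea concrete, here is a minimal sketch of what such unit-driven provisioning might look like behind the scenes. It is purely hypothetical: the unit codes, resource names and catalogue structure are invented for illustration and do not describe any real university system.

    # Hypothetical sketch: merging the requirements of a student's
    # registered units into one provisioning request for their
    # roaming desktop. All unit codes and resources are invented.

    # A co-ordinated, community-wide catalogue of what each unit needs.
    UNIT_RESOURCES = {
        "CS101": {"software": ["python", "jupyter"], "storage_gb": 5},
        "BIO210": {"software": ["r", "bioconductor"], "storage_gb": 20},
        "HIS150": {"software": ["zotero"], "storage_gb": 2},
    }

    def build_entitlements(registered_units):
        """Combine every registered unit's requirements into a single
        bundle of software and storage for the student's desktop."""
        software = set()
        storage_gb = 0
        for unit in registered_units:
            spec = UNIT_RESOURCES.get(unit, {})
            software.update(spec.get("software", []))
            storage_gb += spec.get("storage_gb", 0)
        return {"software": sorted(software), "storage_gb": storage_gb}

    if __name__ == "__main__":
        # A new student registering for two units receives one
        # consolidated entitlement bundle on day one.
        print(build_entitlements(["CS101", "BIO210"]))

The point is not the code itself but the principle it embodies: if the catalogue of requirements is agreed once across the community, the drudgery of assembling each student's environment becomes automatic.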

It is important to note that I am not arguing that a single monolithic solution for all academic computing is feasible or even desirable. Particular research projects and environments will always require specialised solutions, but the ability of local service groups to respond to these is tied directly to the degree to which they can manage the mass-market requirement. Remove the drudgery of mindless routine support activity and you free highly skilled resources for work of higher value.

Yes, I know that much of this can already be done. But to gain the greatest benefit from such integration, as well as new tools and technologies, I believe we need to do some serious house-clearing. We need to be prepared to throw out elderly, bespoke and home-brewed solutions that - although useful and innovative in their day - now often stand in the way of delivering new services on time and on budget.

Inevitably, there are potential problems with this approach. Large public-sector IT projects in the UK have a reputation for lack of control, overspending and non-delivery, and the ownership and security of the environment would need to be carefully managed. But what I am suggesting is in essence a review of philosophy and the more widespread adoption of well-understood industry practices based on accepted management standards, with the actual changes to be made incrementally over time as resources allow.

For this integration and simplification to succeed, the tripartite information flow between senior management, customers and service providers across the community needs to be effective, definitive and open. Without commitment and engagement from all three groups, we will fail to get the maximum possible benefit from the next round of information innovations - which is something none of us can afford to have happen.
