Redefine the way we use the web, to unlock its potential…Web 3.0?

6 02 2010

This is something I have been thinking about for a number of years now, but more so recently with all the talk of HTML 5. Basically, we haven’t really changed the way we use the internet (from a technical point of view) since the web became mainstream, shall we say. Sure, we now use it in ways we hadn’t dreamed of (our habits and the way we communicate with each other), but essentially the web still works the same way it always has: content rendered as HTML and displayed back to us in a web browser. Even if HTML 5 is the magic version and delivers so much more in terms of animation and streaming, has it actually changed the way we use the web, or the way the web works for us? No…

Let’s not go back to the good old Mainframe environment…

It seems more and more IT professionals and large organisations see the web as the new mainframe, especially when you start talking “thin client” and “cloud computing” (the cloud could be seen as our mainframe… scary). When you look at mainframe environments and then at cloud and thin client computing, you see that the basic concepts are very similar. So what do I mean? Well, all of the processing happens on a server; the machine you actually use to access it doesn’t really have to do anything. In a mainframe environment we have dumb terminals; in the new way of thinking (trying not to laugh, sorry) we have a PC that runs a browser (this could be a very low spec machine), and if all we did was “cloud compute” we perhaps wouldn’t need anything else?
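The contrast can be sketched in a few lines. This is purely a hypothetical illustration (the endpoint and function names are invented, not any real cloud API): in the thin-client model every operation is a round trip to the server, while a thick client does the work on its own CPU.

```typescript
// Hypothetical sketch only: the URL and both function names are invented.
// Thin client: every operation asks the server to do the work and waits.
async function thinClientResize(imageId: string, width: number): Promise<void> {
  // All processing happens server-side; the client is just a dumb terminal.
  await fetch(`https://example-cloud.test/images/${imageId}/resize?w=${width}`, {
    method: "POST",
  });
}

// Thick client: the work happens locally; no network traffic is needed at all.
function thickClientResize(pixels: Uint8Array, width: number): Uint8Array {
  // A stand-in for a real resampling step, done entirely on the client CPU.
  const scaled = new Uint8Array(width);
  for (let i = 0; i < width; i++) {
    scaled[i] = pixels[Math.floor((i * pixels.length) / width)] ?? 0;
  }
  return scaled;
}
```

Multiply the thin-client version by 20 designers and every keystroke becomes server load and network traffic; the thick-client version costs the server nothing.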

Sure, I see benefits, some of which are green, but the negatives are obvious. These are essentially the same problems we had with mainframes, the very problems that led us to replace mainframes with the “PC” and the “Network” in the first place.

Some thin client issues?

Let me give you an example. Imagine you and I are working as designers, creating 3D computer models of pretty much anything. We may even be responsible for animating these 3D models (think something like Toy Story, I don’t know why, it just popped into my head). OK, now imagine you are part of a team of, say, 20 working on these models; of course you are designing Buzz, someone else Woody, etc. Let’s think about just how much processing power we need for this, just you and your requirements. The answer: quite a bit. Well, a lot. Now imagine having to multiply that by 20. Oh, and now let’s have that processing carried out in a “thin cloud computing environment” (of course your application is written with the fab HTML 5, which means we can do anything), which at the end of the day means a hell of a lot of work going on at the server, oh, and traffic across our network… Do you see the problems?

Well, basically, even with the advances in our hardware, the server will be doing too much and things won’t go well. The system will be slow, maybe crashing; you as a designer will be going mad with frustration, along with the rest of your team; oh, and not to mention you are working to a deadline, so the project manager is now going mad too. Let’s throw into the mix that our team is distributed across the States and the UK, and some of us love using Internet Explorer, some Firefox, some even Chrome… Hmm, though in theory the web is great here, it is no match for a good old client desktop and some distributed servers…

Now I know I am focusing here on a situation that doesn’t lend itself to “cloud computing” or “thin clients”, but if we believe all the hype around HTML 5 and cloud computing, why shouldn’t we be thinking this is possible? As our hardware advances, so does our software (though at a slower rate, granted), and we as users (be we general public or business) expect ever more performance and capability. So while some of our requirements do seem to lean towards a cloud computing way of working, soon our requirements will no doubt swing back the other way (and won’t we be repeating the mainframe and PC story all over again?)

There is an answer

The answer is pretty simple, to be honest, and it is something Flash pointed us towards a number of years ago when it first started popping up on the web. The answer is a mixture of the two.

So let’s start evolving how we use the web properly: not just our habits, but how it actually works. The web becomes a communications network and in some ways returns to its roots. We can still use it in the way we are used to, as in we find websites and we view them in a web browser. However, those websites that aren’t just presenting us with some information or basic shopping facilities, websites that are really “applications”, get themselves installed on the client machine. So think MS Office on the web. Why install on the client? So that the user experience is not restricted by the web architecture, nor the browser, and so that processing loads are removed from the server and distributed back down to the client PC.

Isn’t that what Flash was doing, installed and running on the client, err, years ago? Yes, and that’s why Flash has worked so well up to now… The problems with Flash are not what it visually looks like, nor its basic architecture (running on the client); the problem is that it doesn’t lend itself to delivering “applications”. So it is great for the web, for showing animations, funky banners, slick movies etc., but don’t think it will be great at delivering that 3D modelling tool we spoke about earlier…

So let’s go back to our 3D modelling requirement in the designer’s studio. In our new web world we are now working with a RIA that actually runs on the client machine, uses local storage on the machine and uses the web only for bare communications and maybe storage of files that are to be shared. All of a sudden, all of the issues with “thin client” and “cloud computing” and server loads are removed, yet essentially we are still using the web and “cloud computing” to an extent…
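A rough sketch of that split (everything here is hypothetical, invented purely to illustrate the shape of such an RIA): the model edits are applied entirely on the client, and the web is touched only when the designer explicitly chooses to share the result.

```typescript
// Hypothetical sketch: all modelling work is local; the web is just a pipe.
type Vertex = { x: number; y: number; z: number };

class LocalModel {
  private vertices: Vertex[] = [];

  // Editing operations run on the client CPU; no server round trips.
  addVertex(v: Vertex): void {
    this.vertices.push(v);
  }

  translate(dx: number, dy: number, dz: number): void {
    this.vertices = this.vertices.map(v => ({
      x: v.x + dx,
      y: v.y + dy,
      z: v.z + dz,
    }));
  }

  vertexCount(): number {
    return this.vertices.length;
  }

  // Only when the designer shares does anything need to cross the network:
  // the serialized model could be POSTed to shared storage at that point.
  serialize(): string {
    return JSON.stringify(this.vertices);
  }
}
```

Twenty designers hammering `translate` all day generate zero server load; the server only ever sees the occasional `serialize` payload.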

So the answer is RIAs that use the client processing power and that do not run in the web browser.

Is this available…

Yes it is. Since Microsoft launched its Silverlight platform (which many see only as a competitor to Flash) it has been working towards this type of scenario, where we can maximise the benefits of the PC alongside the benefits of the web and cloud computing. Silverlight 3 was the first version to deliver an out-of-browser experience, and this has been taken further with Silverlight 4, which can run as a trusted application on the client machine. Oh, it also runs on Macs and PCs and, if in the browser, any browser…

Silverlight, though in some ways similar to Flash and even the old Java applets, is a new way of using the internet, rather than us re-inventing the same way of using the web with more bells and whistles. Like Flash and Java applets, Silverlight essentially runs on the client PC, which means we can utilise its processing power to do our work; it doesn’t need to go back to the server for updates to the UI, business rules or anything like that, as all of that can be done there on the client machine. However, it is connected and delivered through the web as a communications network, so its data and files can easily be pulled and pushed across the web and stored there. Updates to the software are also delivered through the web, with the user able to get the latest version just by using the software itself.
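The self-update idea is simple enough to sketch. To be clear, this is not the actual Silverlight update API, just a hypothetical illustration of the mechanism: the installed application compares its own version against one published on the web, and downloads the newer build if there is one.

```typescript
// Hypothetical sketch of the self-update check (not the real Silverlight API):
// the installed app compares its version with one published by the server.
function isUpdateAvailable(installed: string, published: string): boolean {
  const a = installed.split(".").map(Number);
  const b = published.split(".").map(Number);
  // Compare segment by segment, treating missing segments as zero.
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    const x = a[i] ?? 0;
    const y = b[i] ?? 0;
    if (y > x) return true;
    if (y < x) return false;
  }
  return false;
}
```

The app would run this check on start-up against a version string fetched from the web, so the user gets the latest build simply by using the software.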

At present this is all still young, but the potential is there to change our web experiences and what we realistically should be using the web for. MS Office could be delivered as nothing but a Silverlight OOB (out of browser) application, allowing us to purchase it online and be using it within moments. And it would look and feel just like the version we currently get from a CD (not the slightly less functional web version). Business applications could be delivered through organisations’ intranets, or their “cloud providers”. Websites that provide “secure” trade or partner areas would essentially have these installed on the client machine. Twitter, Facebook and other highly interactive websites would be delivered as RIAs installed on the machine (there is a prototype Facebook client already built, which you can download and use at http://www.silverlight.net/content/samples/apps/facebookclient/sfcquickinstall.aspx). And you haven’t lost the flexibility of the web at all: if you were on a new machine and wanted to get to Facebook, you would still visit the website, where you would be prompted to install the client, a simple and quick install… and away you go, back on Facebook.

The future then is…

Re-defining the web as a communications network and moving RIAs out of the web browser and down onto the client. By using the web in this fashion we get a truly distributed environment that has the benefits of the web, but also the benefits of the client machine…


5 responses

6 02 2010
Max J. Pucher

Andrew, I think this will go quite a step further than you seem to imagine with this description. The future is peer-to-peer and most of the nodes will be mobile devices. You describe the main problem of the current client/server environment well, but it can already be seen that mobile devices that are fully functional without a server have a real problem running multiple applications that want to share information securely across a Virtual Organization.

The solution is not Silverlight or Flash as a programming paradigm for frontends, and it is not SOA (UDDI, WSDL, SOAP), because that still lacks full distributed data sharing between local apps, much less remote apps. Defined service interfaces are a nightmare to manage. What about security and social collaboration? All of that is mainframe driven today, Google most of all. Facebook and LinkedIn run huge mainframes (now called server farms). The Cloud is still an illusion.

The true step forward is still to come. The trick will be to make Flash/Silverlight or any other programming paradigm irrelevant. The power to create any app you need must be with the user, not the programmer. The vendors just have to deliver the platform and infrastructure.

7 02 2010
Andrew Smith @onedegree

I hear you… My point, though, is what can happen today to move us forward, and I think that using apps that run out of the browser in this fashion is a great step forward we should be taking right now…

I think peer-to-peer seems like a great idea, and I love the Grid, but making it work for everyone will be a big ask, simply due to security and that small minority of people who will use that kind of platform to create yet more viruses and spyware. Hence I didn’t go into these in my post…

I’m not sure about users being able to create real applications, though… can’t see that one…

Thanks for your thoughts. I hope others who read this post are as forward thinking…

7 02 2010
Max J. Pucher

Andrew, you are right in terms of security. Building P2P without that in mind would be ridiculous. But if there is no development (coding) necessary to use the platform, where would the viruses come from? If it were truly mobile enabled, the name of the game would be very different. At least that’s what I am working on as the future…

8 02 2010
Andrew Smith @onedegree

Hi Max, it’s a Monday morning and I cannot think how any platform can run without coding being necessary to use it. I fear that I am not as visionary as yourself here and am not grasping quite what you are driving at, well, from a technical point of view; I understand the aim…

8 02 2010
Max J. Pucher

Andrew, that is understandable. Papyrus is the platform that we have been developing and selling for the last ten years. I am not trying to flog my products here, but it developed from exactly the kind of requirements for future applications you describe. Forrester Research said that Papyrus ‘owns the most futuristic vision in content management.’

Because we are still not as far as we would like with our modeling capabilities, we still need some scripting to create an application. But we then deploy it from our application repository to run on virtually anything, including mobiles in P2P mode, and very securely. It includes the users and organization, business objects, business rules, user-created adaptive processes, and user-created frontends (RIA thin and thick client, mobile) built from data mapped into widgets. Most of the scripting work goes into backend data integration, or when users have really strange ideas for the GUI.

So, my points are not as hypothetically futuristic as they seem, because another large bank has just decided to spend two million Euros on Papyrus rather than spend it on EMC or have their application coded by a systems integrator in Java/RIA. Our largest installations are 5000 P2P wired users and up to 25000 portal users. 2000 workflow users will be served from JUST TWO hot-backup process controllers, because the P2P mode distributes the load.

Enough boasting (sorry for the proud father syndrome here), I hope my points are more understandable.
