The future of the web? Apps all the way…

11 02 2011

It is believed that this year will be the first in which more web access happens on mobile devices than on PCs or laptops. That’s a massive shift in the way we use the web. But don’t assume that means we are sticking with browsers, or even HTML 5. What it really means is that more of us are turning to mobile apps for access…

Take an example: from your mobile device, do you use the Tesco website for your shopping, or do you use their app? Almost everyone will say the app (if you shop at Tesco via the web, that is). So why do we use the app and not the website? Simple: user experience…

App user experience

The problem with mobile devices is screen real estate: they are simply small. Even an iPad offers less real estate than a traditional netbook, let alone my 19” widescreen TFT monitor. So seeing everything can be tricky, and it means a lot of scrolling around. Then there is the experience itself: waiting for pages to load over the web, and so on.

Apps provide a more “desktop” type of experience: loading is often done in the background, and core data may even be stored on the device. That means performance is greatly improved and we don’t have to pay extra network charges. In addition, apps are designed specifically around the real estate problem, so we get nice smooth experiences that make browsing in a web browser pale in comparison…
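To make the caching point concrete, here is a rough sketch (in Python, purely for illustration; the `CachedStore` class and its time-to-live threshold are my own inventions, not any real app’s design) of how an app can keep core data on the device and only touch the network when that data goes stale:

```python
import time

# Hypothetical sketch of the pattern described above: keep core data on
# the device and only hit the network when the local copy goes stale.
class CachedStore:
    def __init__(self, fetch_remote, ttl_seconds=300):
        self.fetch_remote = fetch_remote  # stand-in for a real network call
        self.ttl_seconds = ttl_seconds
        self._data = None
        self._fetched_at = 0.0

    def get(self):
        fresh = self._data is not None and (time.time() - self._fetched_at) < self.ttl_seconds
        if not fresh:
            self._data = self.fetch_remote()   # network hit only when stale
            self._fetched_at = time.time()
        return self._data                      # otherwise served from the device

calls = []
def fake_fetch():
    calls.append(1)
    return {"products": ["milk", "bread"]}

store = CachedStore(fake_fetch)
store.get()
store.get()  # second call is served from the cache; no network hit
print(len(calls))  # → 1
```

The second `get()` never leaves the device, which is exactly why the app feels faster (and cheaper on network charges) than reloading a web page every time.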

What can we learn from this…

What we learn is that HTML 5 may be the future of websites, and even of rich internet experiences on the web and to some extent on mobile devices, but the future is still on the device itself. Running software via a browser is architecturally inefficient; it’s very restrictive and comes with no end of issues. That’s simply because the web was not designed to deliver applications; it was designed to deliver content.

Can we deliver “apps” to the desktop? Yes we can. This is something I am a strong believer in. The web is great for delivering content and communications between the client and a server. If we make the small leap that components of an application are content, then we see that we can deliver desktop apps down to the client via a website, and have them communicate with servers in the cloud over HTTP. This is why I love the Silverlight model, as it’s all there…

Delivering applications this way makes the most of the web architecture and leverages all the benefits of being on the desktop, just as “mobile apps” make the most of being on the device. This is a great way of delivering real applications to business users, either over the web or an intranet, running them outside the browser. You have a desktop app with all the flexibility of a web app. A great solution….

Facebook scenario?

I’m not saying this is where we should all be, but websites such as Facebook would benefit massively from a desktop app version. Why? Well, how many people do you hear actually compliment the Facebook website on its look, feel and usability? I can’t think of any; rather, I hear constant moaning about its performance, its lack of intuitive navigation and, well, the list goes on. The only good point is that they can access it over the web. But how many use the Facebook website on their phone? Hardly any; they opt for their device’s Facebook app instead (which delivers a better experience than the website most of the time). So if you had the choice as an end user, wouldn’t you rather have a rich desktop app for Facebook than have to go to the website? I know I would!

Silverlight and Flash can deliver those capabilities; HTML 5 cannot. I think the future should be HTML 5 for websites, and Silverlight and/or Flash for desktop “web” apps…





Redefine the way we use the web, to unlock its potential…Web 3.0?

6 02 2010

This is something I have been thinking about for a number of years now, but more so recently with all the talk of HTML 5. Basically, we haven’t really changed the way we use the internet (from a technical point of view) since the web became mainstream, shall we say. Sure, we now use it in ways we hadn’t dreamed of (our habits, and the way we communicate with each other), but essentially the web still works the same way it always has: content rendered as HTML and displayed back to us in a web browser. Even if HTML 5 is the magic version and delivers so much more in terms of animation and streaming, has it actually changed the way we use the web, or the way the web works for us? No…

Let’s not go back to the good old Mainframe environment…

It seems more and more IT professionals and large organisations see the web as the new mainframe, especially when you start talking about “thin clients” and “cloud computing” (the cloud could be seen as our mainframe… scary). When you look at mainframe environments and then at cloud and thin client computing, you see that the basic concepts are very similar. So what do I mean? Well, all of the processing happens on a server; the machine you actually use to access it doesn’t really have to do anything. In a mainframe environment we have dumb terminals; in the new way of thinking (trying not to laugh, sorry) we have a PC that runs a browser (this could be a very low spec machine), and if all we did was “cloud compute”, perhaps we wouldn’t need anything else?

Sure, I see benefits, some of which are green, but the negatives are obvious. These are essentially the same problems we had with mainframes, the very problems that led us to replace mainframes with the “PC” and the “network”.

Some thin client issues?

Let me give you an example. Imagine you and I are working as designers, creating 3D computer models of pretty much anything. We may even be responsible for animating these 3D models (think something like Toy Story; I don’t know why, it just popped into my head). OK, now imagine you are part of a team of, say, 20 working on these models; you are designing Buzz, someone else Woody, and so on. Let’s think: just how much processing power do we need for this, just for you and your requirements? The answer: quite a bit. Well, a lot. Now imagine multiplying that by 20. Oh, and now let’s have that processing carried out in a “thin cloud computing environment” (of course your application is written in the fab HTML 5, which means we can do anything), which at the end of the day needs a hell of a lot of work going on at the server, not to mention the traffic across our network… Do you see the problems?

Well basically, even with the advances in our hardware, the server will be doing too much and things won’t go well. The system will run slowly, maybe crash; you as a designer will be going mad with frustration, along with the rest of your team. Oh, and not to mention you are working to a deadline, so the project manager is now going mad too. Let’s throw into the mix that our team is distributed across the States and the UK, and that some of us love using Internet Explorer, some Firefox, some even Chrome… Hmm. Though in theory the web is great here, it is no match for a good old client desktop and some distributed servers…

Now, I know I am focusing here on a situation that doesn’t lend itself to “cloud computing” or “thin clients”, but if we believe all the hype around HTML 5 and cloud computing, why shouldn’t we think this is possible? As our hardware advances, so does our software (though at a slower rate, granted), and we as users (be we general public or business) expect ever more performance and capability. So while some of our requirements do seem to lean us towards a cloud computing way of working, soon our requirements will no doubt swing back the other way (and won’t we be repeating the mainframe and PC story all over again?)

There is an answer

The answer is pretty simple, to be honest, and it is something Flash pointed the way to a number of years ago when it first started popping up on the web. The answer is a mixture of the two.

So let’s start properly evolving how the web is used, not just our habits. The web becomes a communications network and in some ways returns to its roots. We can still use it in the way we are used to: we find websites and view them in a web browser. However, those websites that aren’t just presenting us with some information or basic shopping facilities, the websites that are really “applications”, get themselves installed on the client machine. Think MS Office on the web. Why install on the client? So that the user experience is not restricted by the web architecture or the browser, and so that processing loads are removed from the server and distributed back down to the client PC.

Isn’t that what Flash was doing, installed and running on the client, err, years ago? Yes, and that’s why Flash has worked so well until now… The problems with Flash are not what it looks like visually, nor its basic architecture (running on the client); the problem is that it doesn’t lend itself to delivering “applications”. It is great for showing animations on the web, funky banners, slick movies and so on, but don’t expect it to be great at delivering that 3D modelling tool we spoke about earlier…

So let’s go back to our 3D modelling requirement in the designer’s studio. In our new web world we are now working with a RIA that actually runs on the client machine, uses local storage on the machine, and uses the web only for bare communications and maybe storage of files that are to be shared. All of a sudden, all of the issues with “thin clients”, “cloud computing” and server loads are removed, yet essentially we are still using the web, and “cloud computing” to an extent…
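The split being described can be sketched in a few lines. This is only an illustration (in Python, not Silverlight; the `DesignProject` class and `upload` callback are hypothetical stand-ins for a real client app and a real web service): all the editing and storage happens on the client, and the network is touched only when something is shared.

```python
import json
import os
import tempfile

# Illustrative sketch of the split described above: the app does its work
# and storage locally, and the web is used only to share finished files.
# `upload` stands in for whatever real service call would be used.
class DesignProject:
    def __init__(self, path):
        self.path = path
        self.models = {}

    def edit(self, name, vertices):
        self.models[name] = vertices       # heavy work happens client-side
        with open(self.path, "w") as f:
            json.dump(self.models, f)      # persisted to local storage

    def share(self, upload):
        upload(json.dumps(self.models))    # the only network traffic

shared = []
path = os.path.join(tempfile.mkdtemp(), "project.json")
project = DesignProject(path)
project.edit("buzz", [[0, 0, 0], [1, 0, 0]])
project.edit("woody", [[0, 1, 0]])
project.share(shared.append)   # in reality this would cross the web
print(len(shared))  # → 1
```

Twenty designers each editing locally generate no server load at all until they choose to share, which is the whole point of the argument above.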

So the answer is RIAs that use the client processing power and that do not run in the web browser.

Is this available…

Yes, it is. Since Microsoft launched its Silverlight platform (which many see only as a competitor to Flash), it has been working towards this type of scenario, where we can maximise the benefits of the PC alongside the benefits of the web and cloud computing. Silverlight 3 was the first version to deliver an out-of-browser experience, and this has been taken further with Silverlight 4, which can run as a trusted application on the client machine. Oh, and it also runs on Macs as well as PCs, and if in the browser, any browser…

Silverlight, though in some ways similar to Flash and even the old Java applets, is a new way of using the internet, rather than a re-invention of the same way of using the web with more bells and whistles. Like Flash and Java applets, Silverlight essentially runs on the client PC, which means we can utilise its processing power to do our work; it doesn’t need to go back to the server for updates to the UI, business rules or anything like that, as it can all be done there on the client machine. However, it is connected to, and delivered essentially through, the web as a communications network, so its data and files can easily be pulled and pushed across the web and stored there. Updates to the software are also delivered through the web, with the user able to get the latest version just by using the software itself.

At present this is all still young, but the potential is there to change our web experiences and what we realistically should be using the web for. MS Office could be delivered as nothing but a Silverlight OOB (out-of-browser) application, allowing us to purchase it online and be using it within moments. And it would look and feel just like the version we currently get from a CD (not the slightly less functional web version). Business applications could be delivered through organisations’ intranets, or their “cloud providers”. Websites that provide “secure” trade or partner areas would essentially have these installed on the client machine. Twitter, Facebook and other highly interactive websites would be delivered as RIAs installed on the machine (there is a prototype Facebook client already built, which you can download and use at http://www.silverlight.net/content/samples/apps/facebookclient/sfcquickinstall.aspx). And you haven’t lost the flexibility of the web at all: if you were on a new machine and wanted to get to Facebook, you would still visit the website, where you would be prompted to install the client, a simple and quick install… and away you go, back on Facebook.

The future then is…

Re-defining the web as a communications network and moving RIAs out of the web browser and down onto the client. By using the web in this fashion we get a truly distributed environment that has the benefits of the web, but also the benefits of the client machine…





Social Media needs moderation

30 07 2009

Social Media is a great way of engaging the public, getting involved with conversations and enhancing any online presence you may have. However, like all things open to the general public, it can be open to abuse.

There has been a lot of discussion on Twitter today about such abuse, mainly regarding spammers and “bots” (automated, robot-type applications), but also the actions of a minority of actual users. You see, Twitter, like all social websites, is open to abuse from anyone or anything that can get an account open. With today’s APIs and culture of sharing, it’s even easier for spammers to set up applications that latch onto people and discussions, hijacking ongoing conversations and sending out their load of rubbish to anyone and everyone…

Add to this the small number of people who seem to use social media to be abusive (just spend a little time on YouTube reading comments and you will see what I mean), and you can see why large numbers of genuine social media users get hacked off.

“This is something we just need to put up with”

Now this is a statement I hear far too often. Alternatively, we read something along the lines of “we provide users with tools to combat abusive users”. The latter is true: on Twitter I can block someone I feel is abusive, and I can report a post as abusive on YouTube, for example. However, how many of us actually take the time to help moderate? It also doesn’t help with the amount of spam I have to sift through when looking at a trending topic on Twitter, or the number of silly abusive comments I have to read on YouTube before I get to something valid.

Websites that allow customer feedback are always prone to such issues. However, many of them (and I strongly suggest all businesses do this) moderate and check people’s posts before allowing them to be published to the world. I know this can be time consuming, but with a good business process behind it, it can be quicker and easier than you think.

Make it harder

Simple basics make a great difference. I am always surprised how many basic security features, and how much basic business common sense, are missing from social media sites. For far too long, social media websites have been caught up purely in increasing the number of users on their website. This drive for numbers has always come at the expense of security and, funnily enough, the ability to actually make money (the latter is a different post).

So what things can social media websites do to make it harder for abusive users and spammers?

First off, why do social media sites not always authenticate that a user is genuine? Let’s check that someone is actually at that web address and make them follow some instructions before allowing them to open an account. Let’s also capture some information, including their IP address.

Secondly, let’s follow their first “x” interactions (tweets on Twitter, status updates on Facebook, etc.), monitoring them for obvious spamming or abusive activity. This could be seen as a probation period. It isn’t hard to set up, though it would require a human element at some point.

Thirdly, let’s set up some rules to at least try to flag content that may be viewed as abusive, or again as spamming activity. Where possible, let’s have a moderation business process in place so that as many posts as possible can be checked and moderated before being made public (I can see this wouldn’t work on Twitter).

Fourthly, if someone is reported for any abuse (spamming, abusive messages, etc.), let’s investigate the claims and, if true, ensure the account is banned and all its content removed. If we have their IP address, let’s see if we can follow up on the user using it; maybe even inform the user’s ISP?

Finally (well, for this small list), let’s monitor trending topics (Twitter specific) for spam. Once something gets close to the top 10, why not increase monitoring, or employ a human to keep an eye on it?
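The “probation period” and rule-based flagging ideas above are genuinely simple to prototype. Here is a toy sketch (Python; the keyword list, link threshold and function names are illustrative assumptions, not any site’s real policy) that flags a new account’s first posts for human review:

```python
import re

# A toy sketch of the "probation" idea above. The heuristics and
# thresholds here are illustrative assumptions, not any site's real rules.
LINK_RE = re.compile(r"https?://\S+")
SPAM_WORDS = {"free", "winner", "click"}

def looks_spammy(text, max_links=2):
    words = {w.strip(".,!?").lower() for w in text.split()}
    too_many_links = len(LINK_RE.findall(text)) > max_links
    return too_many_links or bool(words & SPAM_WORDS)

def probation_review(first_posts):
    # anything flagged here would go to a human moderator, not be
    # auto-deleted; the rules only narrow down what a person must read
    return [p for p in first_posts if looks_spammy(p)]

posts = [
    "Loving the conference today, some great talks",
    "FREE prizes! click http://a.example http://b.example http://c.example",
]
flagged = probation_review(posts)
print(len(flagged))  # → 1
```

Even rules this crude cut down what the human element has to look at, which is the point: automation narrows, people decide.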

Conclusion?

At the end of the day, spammers and a small number of people and businesses with poor etiquette ruined the concept of mailing lists for email marketing. They now threaten to drown out valid content within the social media sphere. Websites need to try to protect us, the users, against this behaviour. It’s something they should have addressed from day dot, but since they haven’t, they need to address it as a matter of urgency… Facebook, Twitter, listen!

Let’s try to ensure spammers and the abusive few don’t ruin social media and destroy its potential…





SEO, it’s not for techies!

27 07 2009

Search Engine Optimisation (SEO) is often seen by companies as a task for web developers or specialised companies. You don’t have to spend long looking on the web to find techie individuals and techie companies offering such services, some claiming they will get your site onto the first page of Google, Bing and Yahoo… The point is, SEO is NOT something you should be asking your IT guys to provide, and it’s certainly not something you should be going to a “specialised” company for…

There is a common belief, which is simply wrong, that SEO is a highly specialised technical field. In the past there may have been some truth behind this, as web development companies looked for ways to “trick” search engines into ranking websites. However, Google, Bing, Yahoo and the rest aren’t silly; they soon caught on to these tricks and in some cases (BMW, famously) banned websites from being ranked.

The truth is, SEO has almost nothing to do with web development and is not a technical field; rather, it is a highly important part of your organisation’s communications and PR. In this post I will try to explain why…

The techie part of SEO

There are some aspects of SEO that are technical, and it is these aspects that instantly make businesses believe that SEO is for the techie guys. The technical part of SEO, though, is very small, and is simply as follows:

  • Websites need to have “Tags” set up correctly within them
  • Websites should be structured in a “search engine friendly” fashion

So what do these two things actually mean? Well let’s have a quick look:

  1. Websites need to have tags: your web page should have at least three “meta tags” set up. These are just like normal HTML tags: one holds the title of the page (shown in the bar across the top of your web browser), one the description (a description of the content on the page), and one the keywords associated with your web page (these help search engines associate search words with your page). Please note this really isn’t technical; anyone with access to a web page can set these up with two minutes of training…
  2. Websites should be structured in a search engine friendly way: your web page should not be cluttered with unnecessary content, such as the functions and code present only to create the “style” of the website. Such code and style information should live in separate, linked files. The web page should also be structured so that links can easily be found and content navigated. If your website conforms to W3C standards (and it should, for accessibility and compliance), then your website’s structure is already there for SEO.
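To show just how un-mysterious this technical part is, here is a rough sketch of a checker for point 1 (Python, purely for illustration; the `MetaTagAudit` class is hypothetical, though the title, description and keywords elements it looks for are the standard HTML ones):

```python
from html.parser import HTMLParser

# A small sketch of the point above: the "techie" part of SEO can be
# audited mechanically. This hypothetical checker just confirms a page
# has a title, plus description and keywords meta tags.
class MetaTagAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.found = {"title": False, "description": False, "keywords": False}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            name = (attrs.get("name") or "").lower()
            if name in self.found:
                self.found[name] = True

    def handle_data(self, data):
        if self._in_title and data.strip():
            self.found["title"] = True   # title tag actually has text in it
            self._in_title = False

page = """<html><head>
<title>Tennis Shoes | Example Shop</title>
<meta name="description" content="Lightweight tennis shoes for club players.">
<meta name="keywords" content="tennis shoes, court shoes, tennis trainers">
</head><body>...</body></html>"""

audit = MetaTagAudit()
audit.feed(page)
print(all(audit.found.values()))  # → True
```

Notice that the checker says nothing about whether the title, description or keywords are any good; that judgement is a communications job, which is exactly the argument of this post.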

For sure, you need a technical person to structure and implement your website. However, they don’t write your content, so why expect them to realise your SEO needs? Techies should be used only to implement SEO for you.

 

If SEO is not technical, what is it really?

Well, to put it simply, those two technical points just make the site accessible and understandable to search engines, nothing more. SEO is about optimising communications and relationships; by that I mean optimising your website’s content and your online relationship with your customers and marketplace.

If you think about it, the way we use search engines is very simple: we type in what we think will bring back our desired results. So if I want to find a tennis shoe, I will no doubt type in “tennis shoe” or “shoes”, for example. The search engine is, to put it in very simple terms, taking our search words or phrase and matching them to sites that mention tennis shoes. This means that in optimising your website for a search engine, you are really optimising it for what a customer may search for: the way your customer is trying to find and communicate with your website.
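That matching idea can be shown in a deliberately crude sketch (Python, for illustration only; real engines do vastly more, with link analysis, synonyms and ranking signals, so treat this as the core intuition rather than how Google works):

```python
# A deliberately crude illustration of the matching idea just described:
# score each page by how many of the searcher's words appear in its text.
def match_score(query, page_text):
    query_terms = set(query.lower().split())
    page_terms = set(page_text.lower().split())
    return len(query_terms & page_terms)

pages = {
    "shoe-shop": "lightweight tennis shoe range for club players",
    "bakery": "fresh bread and cakes baked daily",
}
query = "tennis shoe"
best = max(pages, key=lambda name: match_score(query, pages[name]))
print(best)  # → shoe-shop
```

The page wins not because of any technical trickery but because its content uses the customer’s own words, which is why the content itself, not the code, is where SEO lives.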

This means that SEO has very little to do with technical aspects, but a lot to do with communications and PR.

 

Factors for good SEO

There are of course factors for good SEO, all of which sit with your communications and PR teams. So what are these factors? Well, here is a list of some of the main contributors to good SEO, though by no means the complete story:

  1. Content is king. Your website content has to be well written, clear, concise and related to people’s searching behaviour. So don’t fall for some SEO consultancy or web techie guy saying they can help; the only help you can get is from communications / PR!
  2. Your keywords have to marry up with the words customers may use to find your business, so you need to understand your customers and how they communicate with you
  3. Keywords aren’t just words you place in a tag at the top of the web page; they are words that should be found in the content of your web page
  4. Don’t dilute your keywords and content by adding everything under the sun. Ensure your website and content stay as focused as possible
  5. Your Social Media contributions / campaigns (Social Media has a very large part to play in building relationships and getting your website out there, which means it has a big part to play in your SEO ambitions)
  6. In-bound web links (these are still very important: the more websites with content relevant to your own that link to you, the better your chances of SEO success)
  7. Length of time you have been online (this is actually quite important. Websites still tend to rank higher the longer they have been around, so don’t expect your brand new website to displace anyone within days!)

 

Conclusion

Well, it’s simple: understand what you actually want to achieve with your website and how it will be found in a search engine. By doing this you realise that the technical aspects all relate to the implementation of the website, have nothing to do with its content, and have very little to do with how customers find you on the web.

A website is essentially an extension of your organisation’s communications, public relations and sales. This means that SEO starts and stops with your communications / PR teams. Once businesses realise this, they will stop wasting money on SEO with so-called specialist companies and get the right people for the job.

If you are looking for SEO services, talk to companies that specialise in communications and/or PR, or to web companies that use specialised people from these fields. Of course, little plug here: you can always use OD Media. Alternatively, speak to a specialised communications and PR company such as GBC or GBC Chocolate





Do we need a web browser?

17 06 2009

There have been a lot of discussions floating around on Twitter and elsewhere with regard to HTML 5, and whether it will kill Flash and Silverlight. To be honest, there is no way this can happen, simply because neither Flash nor Silverlight relies on a third party to make it work. In addition, neither has to conform to a generic standard that can hinder its functionality. Both have product roadmaps, and both move forward at a rate such a generic implementation could never hope to achieve. This means the user experience will always (potentially) be better, and that’s the main aim.

However, both Flash and Silverlight based web experiences do rely on a browser. A browser has to be used by the end user to locate the website, and then for the Silverlight / Flash plug-in to be executed. After that, the browser is pretty much redundant…

In the beginning

In the beginning of the internet, a browser was simply used to locate, access and display basic documents that were formatted in a particular way the browser would understand. (I know I am making this very simple, but I want everyone to see where I am going with today’s post.) This allowed people to access documents that were stored somewhere and read them. If you think of the browser as Microsoft Word, say, and the HTML as the actual document, you start to see where I am coming from…

Browser wars…

Jumping forward to the web as it was a few years ago (before social media, videos and so on), the browser started to become an integral way of accessing content on the internet. Using the HTML format for documents, the browser allowed users to use an address to find content and then interact with it (move around the website, etc.). Now this is all fine if you have one browser, or a set of hard and fast standards that everyone conforms to. But in practice, we don’t…

There are many browsers out there, which essentially have the primary purpose of displaying HTML content to you, the user. However, as users we want more. We want options to store favourites, access feeds, personalise our browser, and so on. We also want websites to do “things”; we don’t want to just read content. So what we end up with is companies fighting for us to use their browser, which in turn becomes a bit of a nightmare for web developers, as their supposedly standardised HTML gets displayed differently in different browsers. Worse, some functions simply don’t work in some browsers…

Do browser wars actually help end users?

Old way of thinking…

For me the web has moved on. We are already saying goodbye to Web 2.0, and some smart person will coin “Web 3.0” before long (which will actually mean nothing different from Web 2.0, or even Web 1.0…). My point is, the web hasn’t changed its implementation; only we as users have changed the way we use it and what we expect from it.

The concept of using a third-party application to access content on the web is old. I don’t like it at all. I also think that using HTML, or any standardised format, to deliver applications is plainly wrong. As a developer you are always being “shoe-horned” into a way of thinking and working that hinders the application’s look, feel and interaction, and therefore detracts from your users’ experience.

Internet websites are no longer formatted pages of information; many now act as applications and with Flash and Silverlight, deliver highly rich, interactive user experiences. With such websites, the browser is simply used to find the RIA (rich internet application) and start it. The application isn’t run by the browser at all. So do we need a browser for this?

HTML 5 is supposed to deliver the ability to show video, for example. However, the same issues will still apply between browsers and websites; they will just be even more complicated.

A new way of using the web

In my own mind, HTML should remain as it is today, but with standards (especially regarding CSS) tightened. HTML is fine at delivering content; that’s, after all, what it was designed for. However, delivering complete websites and rich user experiences should be left to bespoke software such as Flash and Silverlight. This form of distributed computing power helps the end user and enriches their experience. I see no place for a browser on my machine, and would rather see the ability to browse the web built into the underlying operating system.

Websites can then be developed in whatever technology they require, such as Silverlight or Flash. These technologies then display the website / application as they should. The web is used to provide access and download the application / content, no need for a browser…

I hear some of you crying at this point, “how will a search engine pick up the content?”, which is a good point. However, search engines must adapt. Why can they not interact with Flash and Silverlight? With the latter, the content is essentially stored as XML, so it’s not a massive leap. Also, what’s stopping search engines from picking up on tags that fully describe the content, still within the hosting HTML?

HTML shouldn’t be seen as just something a browser understands, but rather as a format the operating system itself understands. Once this happens, and we use the web to distribute applications and information in this fashion, many of the headaches of the web will be removed and we can truly open up the potential of distributed and mobile applications and rich experiences… Silverlight 3.0 already delivers an out-of-browser experience, so are we far from this ideal?





A worrying trend in IT

15 04 2009

You don’t have to look too hard on the internet to find businesses giving away solutions and services. I can’t think of any other industry where businesses actively choose to make a loss. For me, I find it hard to comprehend that individuals would offer their services for nothing, but when businesses start doing it, I fear for the IT industry as a whole…

Why offer software and services for free?

Well, it seems to me to have all started with search engines. Obviously you can’t have people paying to use a search engine; but how does a company that provides a highly valued service do so for free, and bear the implementation costs? What business argument is there behind that?

Well, the only argument is that of users: the more users your service or software has, the more “value” it must hold. Turning that “value” into actual cash, though, proves to be a sticky area.

Money options?

So how does a company with millions of users, such as Google or Facebook actually turn these users into some form of cash flow?

Well, in the case of Google, advertising is the way forward. Advertisers pay to reach a potential audience the size of Google’s. In addition, the actual cost of an advertisement is tiny compared to TV, radio or newspapers. For advertisers and for Google this is a great arrangement, one only possible because of the number of users Google receives.

Now, this currently works well for Google, but is it the silver bullet for all? Probably not. It is hard to place a value on advertisements, and even harder to place a value on an advert placed within another website, such as Facebook or Twitter.

Worrying trends

In the fight for users, companies are starting to provide software solutions for free. Cloud Computing services and access to free web based applications mean users are now questioning why they would pay for something, when they could get it for free. This undervalues so many solutions and services provided by IT companies, bringing such organisations under increasing pressure.

With the current economic trend, many small organisations are offering services at such vastly reduced rates that there is no way they can actually make a profit. Worryingly some IT developers and organisations feel this is the only way in which to win work.

Beware…

While the “users” model works for the likes of Google, the chances of it working even for big players such as Facebook and Twitter are still uncertain. The main issue is, and always will be, that such a business model is very volatile and relies on confidence in your keeping your market share of users. If you start to lose users, you will find your business unravels far quicker than businesses built on a more traditional model.

Not only is the quest for “users” highly risky, it is also damaging to the IT industry as a whole. With services and software being offered for nothing, just to get users to a site, it becomes increasingly hard for small IT businesses to actually deliver and sell their products. This, in the end, can only lead to a slowdown in creativity and, worryingly, a reduction in product / service competition.

Be true to yourself…

If you build and deliver software solutions, always charge correctly. Don’t get caught up in trying to win endless users to fuel a business model that is potentially highly volatile, and flawed.

If you are an end user or consumer, remember: if something is free, there will be a catch. You only get what you pay for in life, and this is still true of the internet and web-based applications. Ask yourself: just why are they offering this for free?