Kill APIs if we want Open Finance

3 03 2021

Over the past few years we have seen PSD2 come into force, and we have had Open Banking (and the OBIE), both with the aim of bringing a world of APIs to banking. The desired goal: to give third parties access to banking so they can provide better customer experiences and more choice. However, as the OBIE is wound down, we are starting to look to the next governing body to help define API standards and ensure infrastructure resilience, while also playing with the concept of Open Finance.

While Open Banking may not have brought mass adoption by end customers, it has at least shown that there are other, more modern ways of doing things. Some great solutions are now being brought to market that are only possible because of Open Banking APIs, but it is fair to say Open Banking hasn’t had the impact many predicted or hoped for.

So, has Open Banking failed? The short answer, IMHO, is no. Rather, it has shown that to really improve customer outcomes, we need to look at customers’ finances as a whole and not just their bank account and credit card activity. This brings us to the concept of requiring other areas of the financial services sector to provide Open API style access. While this may all seem great, there are lessons that must be learnt from Open Banking, and for me, these need to be addressed ASAP.

Standards aren’t standard

Standards are often the key to interoperability. With this in mind, the OBIE and PSD2 set about defining what Open Banking APIs should look like and how they should behave. Banks, though, have to build these API layers knowing that they don’t really fit with the infrastructure or approach of their technology stack. Let’s park the issue of legacy systems: even with an uber-modern core banking system, Open Banking APIs are very prescriptive and will not follow your IT design pattern of choice. Because of this, and various other technical challenges, banks end up working around the spec, and that leads to interpretation. The result: a third party needs bank-specific tweaks for every integration. This is a maintenance nightmare, not just for the third parties but also for the banks themselves, and it has resulted in infrastructure that clearly doesn’t have the same uptime as the banks’ core systems.

With standards, less is often more. Less to cover leads to better focus, which removes interpretation, which leads to a robust standard. A key lesson before we embark on Open Finance: we MUST have less documentation and a greater focus on accuracy.

Ditch direct APIs

If we really want to thin out our standards, then we need to focus on what data is needed, and less on the API implementation and flow. This won’t thin down the documentation massively, but it will allow the pencil to be far sharper in terms of accuracy and remove the wiggle room for interpretation. The second lesson is that banks need to make sure their “Open Banking / Open Finance” infrastructure is resilient and fits more seamlessly with their technological approach. Each bank is different: their IT strategies differ, their core systems differ, their capabilities differ, and their ability to invest in Open Finance differs greatly. This is the biggest lesson we must take forward into the world of Open Finance.

So how do we solve these two issues while still providing external standardised connectivity and interoperability amongst financial services companies? The answer is simple: move with the times…

Direct APIs are dying off. We therefore need to move with the times and kill off this concept of direct Open Banking and Open Finance APIs. Modern architecture uses event patterns, not direct APIs. With an event pattern, a component publishes an event to an event broker. The broker’s subscribers then receive that event and can process it accordingly. There are many benefits here, including the fact that publishing and consuming events is consistent, no matter what system you want to integrate with or what process you wish to trigger. The API for publishing events is consistent and does not change, so you are abstracting API change away from your system. In addition, the beauty here is that a single event can be picked up by multiple subscribers, promoting parallel processing. You can see why direct API integrations are dying off…
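To make the pattern concrete, here is a minimal sketch of an event broker in Python. The topic name, event shape and subscriber handlers are all made up for illustration; real brokers (Azure Event Grid, Kafka and the like) add durability, retries and security on top.

```python
from collections import defaultdict

class EventBroker:
    """A minimal in-memory event broker: publishers raise events by
    topic, and every subscriber to that topic receives a copy."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # One publish call, many consumers: the publisher never needs
        # to know who is listening or how they process the event.
        for handler in self._subscribers[topic]:
            handler(event)

broker = EventBroker()
received = []

# Two independent subscribers pick up the same event for parallel flows.
broker.subscribe("payment.initiated", lambda e: received.append(("ledger", e["amount"])))
broker.subscribe("payment.initiated", lambda e: received.append(("fraud-check", e["amount"])))

broker.publish("payment.initiated", {"amount": 100, "currency": "GBP"})
```

Note how the publisher’s code never changes as subscribers come and go; that is exactly the abstraction of API change described above.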

Event brokers and orchestration

If we want to provide Open Finance, then financial institutions need to expose an event broker. Third parties can then push events onto that broker, which are picked up by the financial institution and acted upon. The institution’s implementation becomes irrelevant at this point; it is simply down to them to act upon the event and return an event if required. This gives them the freedom to architect their solution in a way they know will work, in a fashion that plays nicely with their IT strategy, and in a way that lets them improve resilience. It also makes them far more accountable if they are unable to meet certain uptime obligations.

From a third-party point of view, event broker APIs very rarely change; they are constant. The focus therefore becomes the data within the event, something that can be specified and made extremely concise. From institution to institution the approach will be unified, as will the experience for the third party. This largely removes the challenge of API management and of supporting a plethora of direct APIs and their versions. Essentially, API implementation and change have been abstracted away.
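As a sketch of how concise such a data-focused standard could be, here is an illustrative validator in Python. The required field names are entirely hypothetical, not taken from any published standard:

```python
# Hypothetical field list for illustration only: a real Open Finance
# standard would define its event data requirements precisely.
REQUIRED_FIELDS = {"event_type", "debtor_account", "creditor_account",
                   "amount", "currency"}

def validate_event(event: dict) -> list:
    """Return the sorted list of missing fields; an empty list means
    the event satisfies the (illustrative) data standard."""
    return sorted(REQUIRED_FIELDS - event.keys())

# An under-specified event is rejected with a precise reason...
validate_event({"event_type": "payment.initiated", "amount": "10.00"})
# ...while a complete one passes.
validate_event({"event_type": "payment.initiated", "debtor_account": "a",
                "creditor_account": "b", "amount": "10.00", "currency": "GBP"})
```

The point is that the whole “standard” collapses into a small, unambiguous field list, with no room left for interpretation of API flows.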

This is how we can move to a far more prescriptive standard regarding Open Finance while at the same time, simplifying implementation.

I should also add that event patterns will dramatically improve the customer experience and make everything feel far more integrated, compared to multiple APIs from multiple providers, all of which have to be triggered in a specific order.

Implementation

Financial services organisations need only leverage small aspects of the cloud to enable this new approach. Both Azure and AWS have highly mature, robust event orchestration capabilities, and most banks globally have relationships with both Microsoft and Amazon. Simply utilise these cloud providers’ orchestration capabilities; technology such as Event Grid and Event Grid Domains from Azure will do the trick.
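As a rough illustration, here is how a third party might construct an event envelope in the shape of the Azure Event Grid event schema (id, subject, eventType, eventTime, data, dataVersion) using Python. The event type and subject values are hypothetical, and a real integration would also need a topic endpoint and access key:

```python
import json
import uuid
from datetime import datetime, timezone

def make_event_grid_event(event_type: str, subject: str, data: dict) -> dict:
    """Build an envelope following the shape of the Azure Event Grid
    event schema. Only the envelope construction is shown; actually
    posting it requires a topic endpoint and key."""
    return {
        "id": str(uuid.uuid4()),
        "subject": subject,
        "eventType": event_type,
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "data": data,
        "dataVersion": "1.0",
    }

event = make_event_grid_event(
    "openfinance.payment.initiated",   # hypothetical event type
    "/accounts/12345678",              # hypothetical subject path
    {"amount": "250.00", "currency": "GBP"},
)
payload = json.dumps([event])  # Event Grid accepts a JSON array of events
```

Whatever sits behind the topic, the publishing side of this code never changes, which is the resilience and abstraction argument made above.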

The setup is consistent and simple, both for the financial services organisation and for a third party. The financial services organisation’s implementation sits behind the event broker, so they don’t need to worry about following implementation specifics; they hook directly into what works best for them. The standard becomes highly data focussed, defining what the data published onto the event broker looks like. Standards such as ISO 20022 will help here, as will Microsoft’s Common Data Model for financial services.

Summary

Open Finance will provide dramatic improvements in customer outcomes once in place. Better access to financial products, improved transparency, better customer service and new innovations will all start to happen. But this can only really happen with better standards around data and simplified implementation approaches, for both third-party providers and financial services organisations.

Direct APIs bring with them a level of complexity that is simply not required in today’s modern architecture. By moving Open Finance away from this now dated construct and towards event patterns, Open Finance becomes far easier to implement and execute successfully. Here’s to the death of Open Banking APIs and the birth of Open Finance eventing…





Building a bank…

27 04 2019

Building a business is one hell of a challenge, so when Nick Ogden (yes, the founder of WorldPay) said to me that he wanted to build a bank, I was a little apprehensive. But before I could really respond, Nick explained that he didn’t just want to build a bank, he wanted to build the UK’s first clearing bank in 250+ years! I think most people would be intimidated; however, we felt that this was one of those rare opportunities where you get to re-imagine how an entire suite of services could be delivered. Back in 2015, this was the birth of Banking-as-a-Service (BaaS).

You see, Banking-as-a-Service (BaaS) is a technology play. It’s middleware; it’s banking deconstructed and delivered as granular, individual services. Plug them together, build out your capabilities, and you can start to deliver banking outcomes for your customers. But BaaS is more than just technology; it’s leveraging all aspects of banking, including your balance sheet and more…

Banking-as-a-Service, I feel, really was born in that room with Nick Ogden back in 2015. That was the start of our journey of building out ClearBank. Not only have we built the first clearing bank in 250 years, it’s also the first clearing bank ever to join the UK’s main payment schemes all at once, the first to build out its own technology, the first to be built completely on cloud technologies, and the first true provider of Banking-as-a-Service. Quite a few firsts there…

At the UK’s Azure User Group, I talked about some of our journey, some of the challenges, and how I would build a bank today. You can view the video here…

 





Machine Learning and financial crime

11 10 2018

When we founded ClearBank, one of the key questions we set ourselves was, “how do we leverage technology to be better?” By “better”, we meant better at the particular services or functions that a clearing bank has to undertake or offer. This one question really set me on the journey of delivering a true Banking as a Service (BaaS) offering to other regulated institutions and FinTechs. It’s also why we wanted to explore Machine Learning to help in the fight against financial crime.

I will be honest, I don’t find “RISK” management fun; however, as part of managing risk and trying to mitigate financial crime, things can get quite innovative, and that is where the fun begins. These areas have some really obvious, yet rather powerful, applications for Machine Learning. I know some will call this AI, but it’s far too narrow to be called AI (I have a real bugbear with people calling things AI when they are Machine Learning – let’s save that for another day though). So, these use cases, what are they?

  1. Fraud detection
  2. Anti-Money Laundering

When most of us read “fraud detection” we think of our experiences with payment cards: either someone else is seemingly able to purchase “stuff!” with our cards, or we get stopped and cannot use our card because the bank thinks something fraudulent is going on. However, fraud is wider than that. Think ID theft, or having your bank account taken over by another individual; these are two other areas of fraud that impact many of us today. So, how can Machine Learning help with transactional fraud, and with fraud such as account takeover? The answer is to use it to learn about you, you as an individual, and, what I like, also to learn about you in the context of your peers’ activities. I will come back to that one in a moment.

The second obvious use case is that of AML (Anti-Money Laundering). This is where we can use machine learning to help identify money movements that could indicate a form of money laundering, especially within closed groups. One of the benefits here of being an actual direct clearer (as in connecting to all the payment schemes) is that you can track the money movements across all the channels, helping to gather sufficient data that a machine learning platform can start to identify money laundering techniques.

For the purpose of today’s post though, let’s just focus on fraud detection…

 

What’s normal?

Let’s use Machine Learning to learn what your transactions look like. Believe it or not, most of us are creatures of habit: we typically buy coffee from the same shops, we purchase our lunch or shopping at similar times of day from similar locations, we visit restaurants on date night (which for me is always a Friday night with the wife), we drink at the same bars, etc. You get my point. Machine Learning can take all that data and start to build a profile of your normal activity. Sure, you will have the odd splurge on some big-ticket item – a holiday, a sofa, a car – but across all your activity, Machine Learning can build a pretty accurate picture of what looks like “normal” activity for you, as opposed to what looks “strange” for you.
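A toy sketch of the idea in Python: flag an amount as “strange” when it sits several standard deviations away from the account’s history. A real profile would obviously use many more features (merchant, time, location) and a far richer model, so treat this as illustration only:

```python
from statistics import mean, stdev

def looks_strange(history, amount, threshold=3.0):
    """Flag a new transaction amount as 'strange' when it sits more
    than `threshold` standard deviations from the account's history.
    Amount alone is just the sketch; a real profile learns many
    features, not one."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > threshold

# A (made-up) habit of daily coffee purchases...
coffees = [2.8, 3.1, 2.9, 3.0, 3.2, 2.7, 3.0]

looks_strange(coffees, 3.05)   # everyday spend: looks "normal"
looks_strange(coffees, 450.0)  # big-ticket outlier: looks "strange"
```

Even this crude profile shows the shape of the problem: “normal” is learnt per individual, and “strange” is simply distance from that learnt normal.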

We can apply similar learning to how you access your account, locations when you access it, the devices you use, the time of your access etc. These data points help form a profile again of what “normal” looks like, and therefore “strange” can be identified.

This all sounds great, right? However, how many of us have had incidents where we find we cannot make a payment, or we get alerts saying, “due to fraudulent activity your card has been suspended”? This can be the result of a “rules”-based matrix approach trying to spot fraud, or a Machine Learning implementation that could be better. Essentially, your provider is identifying “strange” and creating what we call a “false positive” – in other words, it thinks it’s fraud when it isn’t.

 

I love context

With Machine Learning, you can add layers of learning, so why not add an additional layer that looks at the context of “normal” or “strange” in relation to your peers? Let me give you an example, because without one I don’t possess the ability to articulate what I mean…

You never gamble on horse racing; however, it’s the Grand National here in the UK, and you fancy a flutter. When you place your bet, this looks like “strange” activity for you and your account. It could easily be flagged as attempted fraud, and you would be stopped from placing that bet. However, if your banking provider’s Machine Learning platform understands “context”, it can make a better assessment. The platform could learn that it is the Grand National, and that your peers are also all placing bets on that race. This would look “strange” for each individual, but as a group, all of a sudden it doesn’t look like strange activity at all. Essentially, your Machine Learning platform has learnt the “context” of that activity, and therefore it looks “normal”. The result? Instead of being stopped from having that flutter, your horse comes in, you make a fair few “quid” and everyone is happy… The power of Machine Learning with “context”.
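A crude sketch of that second, contextual layer in Python. The peer threshold and the event representation are made up; the point is simply that an individually “strange” event gets downgraded when a large share of the peer group is doing the same thing at the same time:

```python
def strange_with_context(strange_for_user, peer_events, event_type,
                         peer_threshold=0.3):
    """Second-layer check: an individually 'strange' event is treated
    as normal when a large share of the user's peer group is doing the
    same thing right now (e.g. Grand National bets). The 30% threshold
    is an arbitrary illustration, not a tuned value."""
    if not strange_for_user:
        return False  # first layer already says it's normal
    matching = sum(1 for e in peer_events if e == event_type)
    peer_rate = matching / len(peer_events) if peer_events else 0.0
    # Still strange only if the peers aren't doing it too.
    return peer_rate < peer_threshold

# On Grand National day, the peer group is full of betting activity...
peers_today = ["bet", "bet", "coffee", "bet", "groceries", "bet"]
strange_with_context(True, peers_today, "bet")  # context says "normal"

# ...but on an ordinary day the same bet stays flagged.
strange_with_context(True, ["coffee", "groceries", "rent"], "bet")
```

Layering the checks this way is what turns a false positive into a correctly allowed transaction.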

 

Compute, compute and a little more compute

Machine Learning is highly powerful, and I hope you see just how capable and helpful it can be at protecting your account from fraudulent activities. However, you need data, lots of it, and lots of compute power to crunch those numbers and algorithms to actually provide a decent Machine Learning based platform. The challenge therefore is having enough compute power to learn at an individual level, but also at a group contextual level. Until the Cloud really came along, this made Machine Learning a tool that only the real big players could leverage, simply because of the cost of purchasing enough physical compute power. That’s all changed, the cloud allows us to elastically scale resources associated with Machine Learning up and down, which drastically reduces the cost involved. It also brings far greater flexibility in terms of how these platforms are built and connected into the banking systems.

At ClearBank I always wanted to ensure we had sufficient compute capabilities, that’s why our Machine Learning solutions reside within our Azure cloud, giving us access to all the compute power we need, when we need it. We have partnered with some pretty cool technology companies too, such as FeatureSpace, enabling us to build out deep powerful machine learning solutions to fight financial crime, which do understand “context”.

 

Quick recap…

Essentially, a good Machine Learning based solution can protect your account from fraudsters. It can learn what normal looks like for you and, when it understands “context”, it can even spot activities that are yours but don’t typically fall into your normal pattern. The keys to unlocking this level of capability are harvesting enough data and having the compute power to process it all. The cloud here is an enabler, helping financial service providers take advantage of its endless scale in the fight against financial crime.

It would be great to hear your comments and thoughts on this, but also any ideas or applications where you can see the use of Machine Learning really having an impact in financial services.

 

 





ISO 20022 and banking APIs

16 03 2017

I think there are a few perceptions of what ISO 20022 is that are actually quite wrong, or that unfortunately just focus on one area of 20022. In this post I don’t want to go through ISO 20022 in great detail; rather, I’ll give some background pointers and look at how ISO 20022 can be used in a banking API.

So first off, what is ISO 20022? Wikipedia has it as an ISO standard for electronic data interchange between financial organisations. Quite frankly, that is a very narrow view of what 20022 actually is. I would better describe it as a framework for describing financial product types, their associated data, business processes and, finally, the exchange of information – and not just between financial organisations. I prefer to think of ISO 20022 like this, though technically it is a data dictionary and business process definition list. 20022 has been around for some years, over a decade now I think, and throughout that time institutions have been contributing to the standard, growing it and making it more valuable to the community (in my opinion).


Though this isn’t a new standard, adoption has been slow. Here in the UK we have, at best, “interested parties”, though that interest is gathering more and more momentum. None of the UK payment schemes operate on ISO 20022, though this year has seen the big three (Faster Payments, Bacs and CHAPS) working on or publishing documents that “map” ISO 20022 message interfaces to their native scheme interfaces. In addition, I don’t know of any banks that operate ISO 20022 for messaging, let alone for definitions of products, data capture and business processes – the exception being ClearBank. You may be wondering, then, why ISO 20022 is worth implementing. Well, there are a few big wins. The first is that you can standardise your entire organisation’s processes, no matter the types of products you decide to create or model, the services you wish to provide and the banks or schemes you may wish to integrate with. That drastically reduces code duplication, centralises shared processes (reducing the components that require testing), eases maintenance and, all in all, makes life much easier for your business. ISO 20022 reduces RISK across all aspects of your business. In addition, 20022 is gathering momentum; it is simply a matter of time before ISO 20022 is enforced on institutions.

Now armed with this little bit of knowledge, let’s look at some of the key elements that make up ISO 20022.

 

Definition of financial products

Believe it or not, ISO 20022 describes what sort of financial products can be created. This is done at quite a high level; I prefer describing 20022 as a framework, as that’s really what it does: it provides you with a framework in which to describe your financial product. For example, your current account or savings account is, in ISO 20022 terms, considered a cash backed account product. (Refer to the ISO 20022 documentation for its formal name.) 20022 describes other forms of products at that high level; remember it covers all types of financial organisations, products and services, including things like shares and derivatives. However, let’s think of high street banking: think of all the flavours of banking accounts you can have; they all fall back into that top-level “cash account” product type.

In defining a product like this, ISO 20022 also provides information on the associated data that should be attached to a product when a customer opens or holds that type of account. If you’re a FinTech or challenger bank, grasping this could save you lots of heartache in the long run, while at the same time helping build out your account type specifications.

 

Account holder detail

Just as ISO 20022 helps define products, it also defines what data a financial organisation should hold on account holders. It’s not always done in the most intuitive of ways; nonetheless, the standard does expect certain data to be held. Most of this is common sense, but there are the odd items in there that you may have overlooked, so from that point of view ISO 20022 will help you validate what data you’re holding. Please note, though, that this isn’t definitive: use it as a framework, hold more data if you have a business case, and not all of it if you do not. Remember data protection legislation, along with lots of other considerations, when it comes to holding account data.

 

Business processes

I think many forget, or simply don’t know, that ISO 20022 sets standards for very specific business processes. Keeping in mind that 20022 covers ALL forms of financial organisations, there are lots of processes in there. However, the two that really interest the majority of banks, credit unions, FinTechs, challengers etc. are Payment Initiation (known as PAIN) and Payment Clearing and Settlement (known as PACS).

ISO 20022 defines these processes. Again, they are really to be used as a framework, but if you are a challenger bank or a FinTech, it is well worth spending some time familiarising yourself with them; they give you a definitive process that ensures you do the right things, such as considering your regulatory reporting requirements.
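To give a flavour, here is a Python sketch building the group header of a pain.001 (customer credit transfer initiation) message with the standard library’s ElementTree. It shows only a handful of GrpHdr elements; a schema-valid message carries far more (payment information blocks, transaction counts and so on), so check the official message definitions before relying on any of these tags:

```python
import xml.etree.ElementTree as ET

# Namespace for the pain.001.001.03 customer credit transfer initiation.
NS = "urn:iso:std:iso:20022:tech:xsd:pain.001.001.03"

def build_pain001_header(msg_id, created, initiating_party):
    """Sketch of the group header of a pain.001 payment initiation
    message. Treat this as a skeleton, not a schema-valid document."""
    doc = ET.Element(f"{{{NS}}}Document")
    cct = ET.SubElement(doc, f"{{{NS}}}CstmrCdtTrfInitn")
    hdr = ET.SubElement(cct, f"{{{NS}}}GrpHdr")
    ET.SubElement(hdr, f"{{{NS}}}MsgId").text = msg_id
    ET.SubElement(hdr, f"{{{NS}}}CreDtTm").text = created
    pty = ET.SubElement(hdr, f"{{{NS}}}InitgPty")
    ET.SubElement(pty, f"{{{NS}}}Nm").text = initiating_party
    return doc

# Hypothetical values for illustration.
doc = build_pain001_header("MSG-0001", "2017-03-16T09:30:00", "Example Ltd")
xml_bytes = ET.tostring(doc)
```

Even a skeleton like this makes the framework point concrete: the standard tells you which data belongs where in the process, and your job is to fill it in.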

 

Message interfaces

This is the most commonly known feature of ISO 20022: the definition of financial information exchange. PAIN and PACS messages are fully defined. Now, I’m not always one for “standardisation”, as I often find it stifles creativity and innovation. However, when it is applied to something that is “fundamentally part of the fabric” of shared interests, if you like, it works brilliantly and encourages innovation. Where would we be without HTML and the W3C standards? For me, the definition of financial transactions should be standardised; this will help stimulate creativity and, ultimately, financial service and product innovation.

 

Banking APIs

There is a lot being made of banking APIs, or the lack of them. There are three big factors why we don’t have open banking APIs, IMHO:

  1. Aging legacy banking systems
  2. Risk factors
  3. A standard approach

The first one is pretty obvious, but the second is the main challenge that really faces banks. That risk sits in two main areas: the first is securing the API, which can be challenging if you want to make it open; the second is who carries the risk of the payment. With PSD2, that risk conversation needs a lot more thought. The third issue is a standard for the API or messaging.

ISO 20022 provides message interface definitions, and it really is the most obvious choice for banks to build APIs around. However, ISO 20022 doesn’t have a standard or process for securing message interfaces or message exchange channels. I personally think that is a good thing – lots of personal reasons why, which I will not go into in this blog. ISO 20022 messaging is also an XML based interface definition. Nothing wrong with that, but it can be heavy on network traffic. JSON is a much better choice for message interchange, especially when we look towards RESTful APIs. So what is the answer?

Banking APIs can be delivered via REST and secured using public/private keys and tokenisation, all technologies and methodologies with proven track records for security. So the only challenge is wrapping the XML message definition up in a more appropriate format, JSON. Well, that’s not really a challenge: simply serialise the XML payload as a string.
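A minimal Python sketch of that wrapping: the envelope field names are hypothetical, and the XML fragment is a cut-down stand-in for a real pacs.008 message:

```python
import json

# A cut-down stand-in for a full ISO 20022 pacs.008 XML message.
pacs_xml = ("<Document><FIToFICstmrCdtTrf><GrpHdr><MsgId>MSG-42</MsgId>"
            "</GrpHdr></FIToFICstmrCdtTrf></Document>")

# Wrap the XML payload as a string inside a JSON envelope; the
# envelope fields ("messageType", "payload") are made up for
# illustration, not part of any standard.
envelope = json.dumps({
    "messageType": "pacs.008",
    "payload": pacs_xml,
})

# The receiving side simply unwraps the string to get the XML back.
restored = json.loads(envelope)["payload"]
```

The XML travels as an opaque string, so the REST layer stays JSON-native while the payload remains fully ISO 20022.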

So while ISO 20022 doesn’t define APIs, it provides us with a message interface standard which can be used as an API payload. In the open banking API debate, the answers have been sitting there for quite some time…

 

More on ISO 20022

There is lots of information on ISO 20022 at http://iso20022.org, which is obviously a great source. There are digital libraries (Java classes), though I would recommend the written documentation. Obviously a standard shouldn’t tie you to a technology like Java; however, the XML schema definitions leave a little to be desired, so when importing with tools other than those used to create the digital libraries, you end up with lots of class duplication. Frustrating, I know, but a skilled dev can get around it by writing some additional code. It’s something that needs addressing, and probably will be over a longer period of time.

There are also lots of guides on the standard; I’ve even seen a “Dummies Guide to ISO 20022”. You obviously know it’s something worth learning if there is a Dummies guide…





First new UK clearing bank in 250 years

6 03 2017

I’m hoping that if you are reading this blog, you are already aware of ClearBank (https://clear.bank) and have probably read some articles on-line. To be honest, it’s been a lot of hard work and it was great to finally watch Nick Ogden, ClearBank Chairman, up on stage announcing the official launch of ClearBank on Tuesday.

The title of this blog says it all in some ways. In recent years we have seen a lot of challenger banks formed in the UK, partly due to the increased demand for competition from us, the consumers, and partly because of the hard work put in by the regulators to encourage the creation of new banks. However, it’s been 250+ years since a new clearing bank was formed. 250 years! When you think of all the changes this planet has seen in those 250 years – the first train journey in 1804, the invention of the bicycle in 1816, the Wright brothers’ first flight in 1903, and then two of my favourites, the first man in space in 1961 and the first commercial use of the Internet in 1995 – it’s almost unbelievable that there hasn’t been a new clearing bank in all that time.

Until the launch of ClearBank, the UK had just four fully fledged clearing banks. Back in the 60s that number was 16, but through strategic takeovers and market consolidation the UK found itself with just four. Until now: four has become five.

 

What’s the difference?

You may be thinking: what’s the difference between a challenger bank and ClearBank, or a clearing bank in general? Well, it’s quite simple. Banks that aren’t clearing banks rely on one of the clearing banks to provide them with some level of clearing services and access to the UK payment networks. That’s every bank you can think of that operates here in the UK, with the exception of the big four: Barclays, HSBC, Lloyds and RBS. It means every other bank and FinTech relies on those banks – on their IT infrastructure, their integration services, their operational processes, oh, and the fact that they are ultimately competitors.

ClearBank is utterly different. The focus is on providing access to clearing services, UK payment networks and core banking services to other banks, regulated organisations and FinTechs. The bank isn’t aimed at the consumer banking space, so ClearBank isn’t competing with its customers for customer accounts. The bank’s entire infrastructure and services have been purpose built on today’s technology, ready for customers to utilise. As a FinTech, bank or other regulated business, you can finally connect to clearing services and the UK payments network through technology designed with this century in mind, via RESTful services, through a proper SOA! This means ClearBank will be helping FinTechs innovate. As a previous FinTech entrepreneur myself, I know all too well just how restrictive legacy IT from your bank can be, and how frustrating it is to your ability to innovate. It’s so frustrating to have an idea, a vision, but to realise that someone else’s technology stops you from being able to deliver.

 

The impact?

ClearBank will help and encourage market competition. It makes it easier for institutions to access clearing services, it makes it easier and drastically cheaper for institutions to offer their own current account services, and it makes it drastically easier for FinTechs to innovate and build on banking services that are fit for the 21st century.

The impact is simply massive on the financial services industry, and massive to how banking will be delivered in the UK over the next few years.

 

Find out more….

If you haven’t done a basic search for ClearBank in your news feeds, Bing, Google or even on Twitter then do so. There are hundreds of articles now available on-line for you to read at your leisure. However, you can learn pretty much everything you need to know from the ClearBank website, https://clear.bank and this short video….Enjoy…





Death of the PC? You don’t have a clue what one is

13 10 2015

Apple, Google and Microsoft have only just finished their events, and we are now subjected to the usual fanboi articles from the press. I’ve come to get used to this: articles that feel unbelievably biased towards one of these tech giants over the other two, often stupidly so. But recently, the articles just haven’t understood the basics of technology, or what’s been going on over the past decade, let alone what appears to be happening right now.

My main gripe in this post, though, is the lack of understanding of what a PC actually is. It seems that journalists and fanbois alike think a PC is a 1990s desktop PC; that’s simply crazy. In addition, the likes of Gartner and Forbes, with their market analysis, constantly need to re-assess the definition of a PC in the tech market, simply because a PC in its purest form covers the vast majority of devices available.

Mobile is PC

First off, let’s clarify one thing: a mobile phone or a tablet is a PC. PC = Personal Computer, so if anything, mobile devices fit that terminology much more closely than what everyone seems to think a PC is: a good old 1990s desktop. Mobile is so personal, and it’s a computer, so it’s a personal computer.

As soon as you grasp this, it becomes clear why Microsoft (who seems to be forever linked with desktop PCs) is starting to make its own hardware, specifically aimed at “mobile” and more importantly, expanding that market away from where the likes of Apple and Google dominate. By that, I mean pure “mobile”, as in more focus on personal, less on the computing aspect.

What’s been going on

Since Apple really turned the mobile phone into a form of PC, the market has been shifting towards smaller, more personal devices, removing the need for homes to purchase a good old desktop machine. What has been a complete success is actually removing functionality and reducing computing power. I, for one, could do more with an original Windows Mobile PDA device than I can with an iPhone; hell, I used to be able to control servers from that thing. But it wasn’t simple to use, and to be honest, the vast majority of users use nothing more than a web browser and a handful of simple applications on mobile devices. Because of this, Apple made another great innovation, and that was simply making the phone bigger, so it was easy to use on the sofa. Enter the iPad.

Tablets really are where the majority of day-to-day users now carry out their computing (if not still on their mobile phone). The reason is that most tablets, with the web and access to good apps, provide everything the majority of users understand. However, sales of iPads etc seem to have reached the point of market saturation, and that's no surprise; consumers can't keep buying the same thing over and over. In essence, the PC market is now moving away from desktops to tablets, but that's still the same PC market.

Microsoft trying to be different?

With the release of Windows 8 and the Microsoft Surface, Microsoft essentially said "yes, we are very late to mobile devices, but we have a vision where these devices are just as powerful as the desktop you used to have". The reason this isn't that popular is that the vast majority of users don't need that power or complexity. Hell, the tech journalists don't even understand that's what Microsoft is trying to do, nor why.

However, the Surface Pro hit some notes with a large chunk of the mobile PC market, and that chunk is focussed on productivity. Though the majority of users out there don't need to be productive, there is a market for people who do want productivity from their devices. How many people do you know (in business) who turn up with their iPad? They may make some notes on it, but when it comes to carrying out anything worth doing, they pull out a laptop that appears to be several years old. That user is now carrying two devices around with them, and that's not the point of tablet or mobile computing, is it?

Microsoft therefore tried to provide for that niche market, in the hope, I believe, of gaining a foothold and then expanding it to the rest of us daily users. It has taken until Windows 10, and Microsoft's most recent launch event, to really start to show how effective this approach is. With the Surface Pro 3 and Windows 10, Microsoft delivers a device that is mobile. It's not an old desktop vision for the company or for Windows; rather it's mobile and personal first, with no compromise on computing or productivity.

Going forward

Who knows what the market will do. However, Microsoft must be hitting the right notes, with sales of Surface Pro devices doing well. You only have to look at the iPad Pro to see that Apple and Google are aware that Microsoft's approach will see it selling devices and potentially taking market share from them both. After all, why carry an iPad and a laptop? Or have an iPad and an old desktop machine at home or work, when you can have a Surface Pro that is your tablet, your laptop and, with a "dock" accessory, your desktop machine too? That's three devices in one.

For businesses, Surface Pro allows them to provide a single device to their employees, taking away an utter nightmare of hardware provisioning, policies, security, ISMS and so on. For consumers it brings the same common-sense approach. Why have two or three devices? Why not have a tablet that is my laptop and desktop? Apple and Google have spotted that this is a real threat, hence the release of their own "pro" versions of their tablets, though neither has the innovation or capability here to compete with Windows 10 or its power on a mobile tablet.

We see that Microsoft is going further with this, especially with Windows Phone 10 and "Continuum" enabling your phone to replace your desktop simply by connecting to a dock device. That's your phone powering a real desktop screen, keyboard and mouse, allowing any user to be productive with just their phone. Add universal apps from Windows 10 and you really see that Microsoft is banking on mobile PCs as actual productivity devices, not just personal devices. The theme continues with the Surface Book, a laptop first, that can be your tablet (detach the screen) or your desktop replacement.

What's clear is that the "mobile" market is the PC marketplace, and that mobile now appears to be embracing the need for productivity and computing power. With the market moving that way, is Microsoft on the right path to take pole position in our computing lives again? Are Microsoft's devices, along with Windows 10, on the right path: mobile computing experiences across a range of devices, giving us real freedom of choice in how we carry out our computing activities, without compromising on productivity or power?

In a recent article in the Daily Telegraph (Best of luck Microsoft, but the Surface Book isn't going to save the PC) I couldn't help but think "oh my God, this guy just doesn't have a clue". If you think a laptop is a traditional "PC pitch" from Microsoft, then you don't understand what has been going on, what a PC really is, or what we are seeing from the tech giants or the marketplace. To be fair though, the graphs showing PC sales don't get it either; they focus on traditional desktop machines, which is a narrow view of the PC market.

One thing we must also remember is that a desktop is easy to upgrade. Many consumers out there have old desktop machines and simply update them. The same can be said of businesses: with simple upgrades to RAM, most desktop machines have their life extended quite considerably. Throw into the mix that you can still run Windows 10 on these devices, and you don't need to buy a new desktop as often as any other device.

The traditional desktop may not be the entire market anymore, but the "PC" market is simply growing and growing, with many more devices delivering personal computing experiences. Dominating the PC market is still the playing field; the devices just look different!





Anyone innovating?

1 10 2015

First off, I've been a bit quiet on the blogging front for a little while – sometimes real work takes over and it's hard to get motivated to post a meaningful blog…

So, I've sat through two rather dull technology events in the past few weeks. First off, Apple really did disappoint with their new releases, nothing new there at all. No, tell a lie, I did quite like the pressure-sensitive screen feature on the new iPhone. It's quite innovative, but its value is really hard to justify. Would I upgrade to the new phone because of it? Nope, but that doesn't mean millions of "fans" won't, quite the contrary really. The second event was Google's. Now this was awful. Dull devices and nothing new at all…

One thing I did notice in both events is the desire to copy innovation from a company that is apparently uncool and hasn't innovated since the late 90s… Yep, Microsoft. It seems that Microsoft's new approach of a single OS across all devices is starting to pay off. Mix that with the Surface Pro range of devices, and there is a real movement in the market towards "hybrid" tablet/laptops. This is clear to see in the launch of the iPad Pro and some new Google option (its name is awful and reminds me of a fax machine). The Google copy, though, is blatant. The device looks like a Surface Pro all day long…

Why copy?

It seems that people are starting to realise they can have a single device that acts as their tablet, but can also be their tool of choice when it comes to productivity. Business IT departments have started to realise this, and now, it seems, so have some of us consumers. I myself use a Surface Pro 3 to replace my laptop and my work desktop PC. It works brilliantly in both environments, especially with the docking station. I also use it as my "tablet" machine that finds its way to the sofa, where it is of great use like most tablet devices.

With this in mind, both Apple and Google have to be aware that "mobile"-only tablets may have a shelf life. After all, can both companies really expect businesses and consumers to continually shell out for multiple devices when one could do the job of three? I think there is an awakening that, actually, Microsoft has been the innovator over the past 18 months, and with its Windows 10 OS and the launch of the Surface Pro 4 coming any day now, there could be a real market shift away from dumber tablets towards tablet/laptop hybrids. If that's the case, Microsoft is a long way ahead of the game here, with both Apple and Google only offering lightweight mobile OSes on their devices.





The big bank cyber cover-up

15 04 2015

Something that I have long suspected (and been aware of) is that banks don't like admitting when money goes missing. It doesn't matter if it is their money, or yours and mine; the point is that if anything goes missing, it looks bad for an institution that is supposed to be your secure holder of money. The circumstances don't really matter either, be it a dodgy employee doing something naughty at the cashier's desk, or customers being subjected to cybercrime and fraud; the fact is the bank won't report it… This thought of mine is backed up by a statement made by City of London Police chief Adrian Leppard, claiming that he believes up to 80% of online crime goes unreported. Have a read of this article in Finextra: http://www.finextra.com/news/fullstory.aspx?newsitemid=27226

The challenge

So why is cybercrime growing so massively? The simple fact, something I've been complaining about for a long time now, is that no matter what you do, you cannot secure something that is inherently not secure. What do I mean? Well, card details are not secure. They are printed on the thing; nothing sophisticated is needed to get hold of card details at all. This means card schemes, banks, payment service providers, online payment gateways and businesses all have to spend vast amounts of money trying to prove that those card details (at the point of a purchase) are in the hands of the owner. The simple fact that I can get those card details so easily means that for a person willing to undertake some cyber fraud, or card fraud in general, it's easy; it's a weak point in the system.

My point is proven even when you add technology upon technology upon technology. Just look at the recent issues with Apple Pay. Apple's claim that the system is so secure is actually not a million miles from the truth, and it would hold if Apple could secure the card details being added to the device. But since those details are not secure in any way, shape or form, it's easy to add other people's card details to my own Apple device, and away I go…

The solution?

The solution is so blatantly simple it frustrates me: move away from cards! We don't move away from cards because of the cost of the card scheme infrastructure, an infrastructure that is massively outdated in today's cyber world. Card schemes are simply easy pickings for cyber fraudsters.

When I say move away from cards, I don't mean just replace the physical card with your phone, à la Apple Pay; I mean ditch the scheme itself. There really is no need for a card in a transaction, as proven by a number of mobile payment technologies out there that move away from card schemes and effectively run schemes of their own, utilising "e-money". These businesses/schemes have a massive opportunity to provide security that simply removes fraud, building technology with modern-day security in mind, and all of a sudden the fraudster's life is much, much harder. If you detach from the dependency on a card scheme, you have payment systems that are secure, you reduce fraud, you reduce risk, and you drastically reduce the cost of a transaction for a business, and ultimately the cost of the products and services consumers purchase.

The only issue is business adoption: educating businesses about the benefits to them, the cost savings and the difference in user experience. That's the massive challenge, and it's why many mobile payment start-ups are failing. Business owners simply don't have the time to be educated on this stuff…

So the company that cracks that nut could get a new scheme out there and start reducing the levels of cybercrime… I'm sure the banks will eventually like that idea!





The Joel test

10 02 2015

Yesterday I was asked my thoughts on the "Joel test", as a good friend of mine got the bad news that his development team scores just 7 on it. He wanted to know what it is and "is that score a cause for concern?"

This post is going to be a little tech-focussed, as I am sure you are guessing, but if you are a CEO/CFO and want to know what's going on (what you're spending your money on in an IT development team), then you will find this post of value.

Now, I'm not a strong believer in trying to measure your development team's success or strengths by any means other than: are you happy with the product being delivered? All too often we get caught up in some metric for measuring just how good something is, and while we are doing that, and maybe getting great "scores", we lose sight of the fact that the product being delivered is actually poor. That being said, I think the Joel test is a good indication of the state of your software development environment, and if that is in good working order you are at least giving your team the best chance to succeed.

The Joel test is dead simple, and though I've read plenty of opinion saying it doesn't work for Agile, I simply have to say: use a bit of common sense and apply it in the correct fashion to your preferred development methodology. I am a strong believer in agile and SCRUM; we operate that here religiously, and I would say our Joel score is 11. Not the perfect 12, simply because I don't always fix bugs before continuing with new development work; I personally prefer to address bugs towards the end of a development cycle.

So here we go, the Joel test:

Do you use source control? You must be saying YES to this, simple as that. Good source control will also provide you with build services for continuous builds; see a later question.

Can you make a build in one step? This should be a YES. Build scripts or continuous build services ensure your code is always able to build and run. When a build is broken, you have to fix that before anything else, and what's great about a continuous build is that you find these problems sooner rather than later.
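As a sketch of what "one step" means in practice, here's a minimal build script. The compile and test commands are hypothetical stand-ins (swap in your real compiler and test runner); the point is that a single command takes you from a clean tree to a tested build, and stops loudly at the first failure:

```python
import shutil
import subprocess
import sys
from pathlib import Path

BUILD_DIR = Path("build")

def run(cmd):
    """Run one build step; check=True aborts the whole build if it fails."""
    print("->", " ".join(cmd))
    subprocess.run(cmd, check=True)

def build():
    # Always start from a clean slate so stale artefacts can't hide breakage.
    shutil.rmtree(BUILD_DIR, ignore_errors=True)
    BUILD_DIR.mkdir()
    # Hypothetical steps: replace with e.g. run(["make", "all"]) or
    # run(["msbuild", "MySolution.sln"]) and your real test runner.
    run([sys.executable, "-c", "print('compiling...')"])
    run([sys.executable, "-c", "print('running tests...')"])
    return "build OK"

if __name__ == "__main__":
    print(build())
```

Hook the same script into your continuous build service and a broken build surfaces within minutes of the offending check-in, rather than days later.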

Do you make daily builds? See above, I would say.

Do you have a bug database? You MUST have something like this, otherwise you have no hope of tracking issues and fixing them. You don't even need to be that sophisticated, though I like my UAT testers to push bug issues into the same tool we use for specifying out storyboards (SCRUM).

Do you fix bugs before writing new code? This is the one I let slide. I make sure everyone is aware of the bugs, and if they are in an area of the system that will be worked on, then YES, let's do that. However, bugs are often not in the same areas, and in such cases I prefer to keep the development velocity up and come back to them at a specified date and time (typically the start of the following sprint).

Do you have an up-to-date schedule? Now, some will say NO to this because they use XP or something; personally, I find XP hit and miss. SCRUM lets me specify which storyboards we need to work on, and then we work on them. We don't have an old-fashioned specification as such, nor an old-fashioned schedule; rather, we have lightweight roadmaps and storyboards, because that is what SCRUM needs. So I still answer YES to this question, even though we use SCRUM.

Do you have a spec? You need some form of spec, and if you answer no to this, your development efforts will fail. SCRUM provides developers with a spec in the form of the storyboards they follow, with their identified tasks. Without them, you have no hope.

Do programmers have quiet working conditions? This should be a yes, even if you are using XP. Collaboration is always fine, but conditions should on the whole be conducive to concentration.

Do you use the best tools money can buy? We do, but I don't think it is the end of the world if you answer no to this. I personally like to push the team forward, as the best tools typically help productivity.

Do you have testers? I hope you answer YES to this.

Do new candidates write code during their interview? This is harsh, but I insist on it, and what's worse, I insist it be done with just pen and paper. I'm not looking for syntax, rather a good understanding of OOP and problem solving.

Do you do hallway usability testing? I'm not sure many people do this, but I do like it. I especially like expanding it out to focus groups if and when you get the chance. If you don't have the resources for that, hallway testing can easily be done; just get some friends and family involved :)
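If you want to track your score over time, the twelve questions above boil down to a trivial tally: one point per "yes". A throwaway sketch (question wording paraphrased from Joel's list):

```python
# A minimal Joel-test tally: twelve yes/no answers, one point per "yes".
JOEL_QUESTIONS = [
    "Do you use source control?",
    "Can you make a build in one step?",
    "Do you make daily builds?",
    "Do you have a bug database?",
    "Do you fix bugs before writing new code?",
    "Do you have an up-to-date schedule?",
    "Do you have a spec?",
    "Do programmers have quiet working conditions?",
    "Do you use the best tools money can buy?",
    "Do you have testers?",
    "Do new candidates write code during their interview?",
    "Do you do hallway usability testing?",
]

def joel_score(answers):
    """The score is simply the number of 'yes' answers out of twelve."""
    return sum(answers.get(q, False) for q in JOEL_QUESTIONS)

# Example: answer "yes" to everything except fixing bugs first (my one slide).
answers = {q: True for q in JOEL_QUESTIONS}
answers["Do you fix bugs before writing new code?"] = False
print(joel_score(answers))  # 11
```

Unanswered questions count as "no", which is the right default: if you don't know whether you have a bug database, you effectively don't.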

Anyway, that's my take on the Joel test. Don't get too hung up on your score, but as Joel states, a score lower than 10 indicates serious development problems… I would probably say lower than 9 is big trouble…





Wearables… Would you?

4 04 2014

The vision of the future is always something you either love and want to embrace, or, like me many times, you find yourself sitting back laughing and wondering what planet these people are on. That being said, I'm sure people did exactly that when we went looking for the Americas, or when someone said "hey, I can make a ship out of iron"… But in technology, and with the whole world of blogging, we have a lot more opinions that frankly seem based on no real practicality or thought. In some cases it feels to me that many tech businesses develop tech simply because they can, rather than asking "does this have a benefit to real people?"

In this post I want to have a quick look at wearable technology with payments.

 

Wearable tech

OK, this isn't anything new; hell, my first watch, given to me back in the day, was and is a form of wearable technology, so let's not start thinking wearable tech is something new. It's also nothing new for wearables to steal from tech we already have. Does anyone remember the good old Casio calculator watches?

In some cases it works well, but I believe that is quite rare. When I look at the majority of watches in a jewellery store, the added features, apart from telling the time, are a stopwatch, maybe some form of depth measurement, and then a convoluted way of measuring air speed. There must be a reason for this, as we have had digital watches for a very long time now.

I personally believe watches are more jewellery than technology in our minds. We want them to look good, act as a status symbol sometimes, and do just one thing well: tell the time. Oh, and I don't want to have to recharge them, or even change the battery; that feeds into the jewellery angle too, I believe. In today's mobile world, if I want access to computing power I will opt for a device that suits the job, and today that is a mobile phone. It's a device I carry everywhere, and would do even if I had a smartwatch (even one I could make calls on), because it's far more natural to interact with a mobile than with a watch. Just think of touch interaction on a watch compared to a large touch-based smartphone. Experience is everything here.

So, do I want apps on my watch? Probably a no for the vast majority of us. We will continue to opt for stunning analogue devices that show real craft and engineering over their smart counterparts. But would I use wearables for other things, not just apps?

 

Payment wrist bands

I've seen NFC-based wristbands now, which again look kind of cool, but really, in the real world, would I be seen with that on my wrist? If embedded in my watch then maybe, but in a rubber-like band, made popular by Lance Armstrong's "Livestrong" campaign, that's a no-go. I would also be paranoid about anyone reading my ever-broadcasting wristband and swanning off with the ability to make payments from my card details.

Wristbands are a prime example of tech companies delivering technology for technology's sake.

 

Finally, Google Glass?

Some people really love this concept; I personally don't (are you sensing I'm not a fan of wearable technology?). For one, I hate things in front of my eyes or distracting my vision. I also don't like to look silly, and until Glass looks a little more stylish it is always going to have issues.

This is before we even raise the issues of data protection, privacy and so on. I'm not saying wearable technology will not turn out to be profitable, I'm sure it will, but, and this is the thing for me, IMHO wearable tech is not a game changer and it won't be adopted by the majority…

Usually I am very pro-technology….