A content delivery network (CDN) is a system of distributed servers that delivers web pages and other web content to a user based on the geographic location of the user, the origin of the web page, and the location of the content delivery server.
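The core routing idea can be sketched in a few lines: given the user's location, pick the geographically closest edge server. This is a minimal illustration only; the server names and coordinates below are invented, and real CDNs also weigh server load, network topology, and link latency, not just distance.

```python
import math

# Hypothetical edge-server locations (latitude, longitude); names are illustrative.
EDGE_SERVERS = {
    "frankfurt": (50.11, 8.68),
    "virginia": (38.95, -77.45),
    "singapore": (1.35, 103.82),
}

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two points on Earth."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_edge(user_lat, user_lon):
    """Route the user to the geographically closest edge server."""
    return min(
        EDGE_SERVERS,
        key=lambda name: haversine_km(user_lat, user_lon, *EDGE_SERVERS[name]),
    )
```

A user in Paris would be routed to the Frankfurt node, while a user in New York would land on the Virginia node.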
The world is rapidly heading towards a future in which the vast majority of the population has access to the internet, an e-mail address, and a PC, a smartphone, or both. This means that, for the first time, it is possible for a company to consider selling to a consumer market that literally consists of billions of individuals. The size and geographical location of the company in the physical world are no longer relevant; it is all about your presence in the digital world.
A Sales and Marketing function will obviously analyze and segment its marketplace, and its goal will be to be as granular as possible. First-generation websites introduced multi-language capabilities in order to allow segmentation by country. But the ambition today is to be able to segment the global marketplace down to the level of a single individual.
This requires tailoring the content provided over the web for millions, potentially billions, of individuals. Ten years ago this would have been impossible, but today the advances made in Big Data and CDNs are changing this “impossible dream” into a “realistic possibility”.
The Challenges Involved
So the first challenge for IT is Big Data, as we attempt to build accurate data on all our customers. This data can come from external sources, existing web usage patterns, and corporate applications, and it all needs to be integrated and analyzed in order to build up accurate profiles of all our customers. Obviously this problem is much smaller for a B2B market than for a B2C market.
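The integration step described above can be sketched as folding several keyed data sources into one profile per customer. This is a toy illustration under invented field names (`crm`, `web_logs`, `external` and their contents are assumptions, not a real schema); production profile-building would also need identity resolution and conflict rules.

```python
# Hypothetical per-customer records from three sources, keyed on customer id.
crm = {"c1": {"name": "Alice", "country": "DE"}}
web_logs = {"c1": {"pages_viewed": 42}}
external = {"c1": {"segment": "smb"}}

def build_profiles(*sources):
    """Merge any number of keyed data sources into unified per-customer profiles.

    Later sources win on conflicting field names (a deliberate, simple rule).
    """
    profiles = {}
    for source in sources:
        for customer_id, fields in source.items():
            profiles.setdefault(customer_id, {}).update(fields)
    return profiles

profiles = build_profiles(crm, web_logs, external)
```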
Assuming that accurate profiles can be constructed using Big Data platforms and their associated analytics platforms, the next challenge is to be able to deliver unique content via the web to each individual customer or potential customer.
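At its simplest, delivering unique content means mapping each profile to a content variant. The sketch below assumes a segment field produced by the analytics layer; the segment names and variant identifiers are invented for illustration, and a real system would select among far richer content assemblies than a single page id.

```python
# Illustrative mapping from customer segment to a content variant.
VARIANTS = {
    "enterprise": "case-studies-and-roi-calculator",
    "smb": "quick-start-pricing-page",
    "default": "generic-landing-page",
}

def select_content(profile):
    """Return the content variant matching this customer's segment,
    falling back to generic content when no segment is known."""
    return VARIANTS.get(profile.get("segment"), VARIANTS["default"])
```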
So the next challenge for the IT function is to work with the various content-creating departments within the company to ensure that the full breadth of content is available. Most importantly, it must not just be available at a single point in time: how will the content be continuously updated to make sure that it remains fresh and relevant? Standardized content authoring and management systems will be needed. Most organizations underestimate the ongoing effort required to keep content up to date, so IT tools that deliver productivity in this area are essential to ongoing success.
Then the next challenge for the IT function is to build out a CDN capability for the entire web presence. Local caching of data will be needed for performance reasons. Tracking customers’ behavior on the web will be critical, because behavior changes and evolves, and it changes much faster in the digital world than in the physical world. I choose to say “physical world” and not “real world” because I think that the “real world” now consists of the sum of the “digital world” and the “physical world”. In business terms they are becoming equals.
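The local-caching point can be illustrated with a minimal time-to-live cache of the kind an edge node might keep: serve content from the cache while it is fresh, and go back to the origin once it expires. This is a sketch only; real edge caches honor origin cache-control headers, handle invalidation, and bound memory use.

```python
import time

class EdgeCache:
    """Minimal local cache with time-to-live (TTL) expiry."""

    def __init__(self, ttl_seconds=60.0):
        self.ttl = ttl_seconds
        self._store = {}  # url -> (content, stored_at)

    def get(self, url, fetch_from_origin):
        """Serve from cache if the entry is still fresh; otherwise fetch
        from the origin server and cache the result."""
        entry = self._store.get(url)
        now = time.monotonic()
        if entry is not None and now - entry[1] < self.ttl:
            return entry[0]  # cache hit
        content = fetch_from_origin(url)  # cache miss: go to origin
        self._store[url] = (content, now)
        return content
```

Repeated requests for the same URL within the TTL hit the cache and never touch the origin, which is exactly the performance win the paragraph describes.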
Lastly, everything needs to be integrated: the content authoring tools to create sufficient content, the Big Data and analytics tools to accurately segment the customer base, and the CDN infrastructure to deliver the targeted content. Successful integration is the key to making a CDN strategy viable, because the processes need to be automated rather than manually intensive.
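The end-to-end automation argued for above can be sketched as a single pipeline that chains the three pieces: build the profile, segment it, select content, and hand it to the CDN for publishing. Every function name here is a placeholder for whichever real tools fill each role, not an actual product API.

```python
def personalization_pipeline(customer_id, sources, segmenter, selector, publisher):
    """Chain profile building -> segmentation -> content selection -> publishing.

    sources:   list of dicts keyed by customer id (CRM, web logs, ...)
    segmenter: profile -> segment label   (the analytics layer)
    selector:  profile -> content variant (the content layer)
    publisher: pushes the chosen content toward the CDN edge
    """
    profile = {}
    for source in sources:
        profile.update(source.get(customer_id, {}))
    profile["segment"] = segmenter(profile)
    content = selector(profile)
    publisher(customer_id, content)
    return content
```

The point of writing it this way is that each stage is swappable: replacing the analytics vendor or the content store changes one argument, not the pipeline.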
What the Future Holds
Given how far Big Data and CDNs have progressed in the last five years, it is reasonable to expect further major advances in the next decade, by which time today’s new technologies will have moved into the “IT mainstream”. But some things will not change. California’s Silicon Valley will still be leading the pack and pushing the envelope on new technologies. Moore’s Law will still apply, and processing petabytes of data on millions of customers will no longer be the exclusive domain of IT giants such as Google and Amazon. It will be “business as usual” for the Fortune 500.
The art for CIOs, as always, will be one of timing. Does your business need to be a pioneer and first to market, or is a fast-follower strategy more appropriate? When is the right time to re-train staff and invest in the new technologies? When is the technology mature enough to support business-critical processes? And can you pick who the long-term winners will be as the technology rapidly evolves? Will it be the existing major IT suppliers, or will they be displaced by companies that do not yet exist?
In order to ensure that investments in future technologies are not wasted, it is probably a good idea to employ one or two architects within the IT function whose mission is to detect and track evolving technologies, and to assist the CIO in making the key decisions on when and where to invest.