The Content Philosopher

WebSite: http://jgollner.typepad.com

These photos are separated from my Travels album because Oxford is something of a second home. I still manage to visit it several times a year. So the pathway between Manotick and Oxford is well trodden and I can likely do it with my eyes closed - and probably have on more than one occasion. This series of photographs was taken over the last few years.

I have stayed at the campus of Royal Roads on several occasions and I have been repeatedly impressed by the grounds. They are in many ways a little-known treasure.

Here is a selection of pictures I have taken during my travels over the last few years. I am very obviously an amateur photographer and it is not uncommon for me to forget my camera altogether when packing. What the pictures do not convey is the fact that in these travels I have met, and gotten to know, a great many interesting people.

Manotick, Ontario, Canada is the part of Ottawa that I call home. Much of Manotick stands on an island in the Rideau River. Interestingly, the Rideau Canal, which runs through and around the river, was recently designated a World Heritage Site by the United Nations. So this means that the view from my backyard is in some way on a similar par with the Egyptian Pyramids - although the thought strikes me as ridiculous.

People have questions. Sometimes they find answers. More often than not they construct answers from what they find. More frequently still they construct provisional answers and muddle along. This has always been the case. But as with so many things, what we have always done is being thrown into a new digital light as we grapple with a landscape filled with new technologies and novel techniques. One thing we can say with confidence is that, relative to the past, we have managed to stir up even more questions and simultaneously we have made it more difficult to find or construct answers. 
Yeah, us.

I was provoked to think about questions and answers by the call to participate in a blog carnival marshalled by my friends at the European Association for Technical Communication (Tekom) and focused on the question "How will intelligent information shape our future?" See the Intelligent Information Blog (iiBlog) for more information about this blog carnival — which is a cool idea as well as one that invites discussion of an interesting topic.

I have had various dealings with this community before, such as contributing to their newsletter (see TCWorld e-magazine), participating in some of their events (see TCWorld Conferences and Information Energy), and spending some time scrutinizing the International Standard for Intelligent Information Request and Delivery (iiRDS). When I saw the invocations to participate in a Mardi Gras blog parade on intelligent information, it was a bit like going to a disco night at one of the TCWorld events: they were definitely playing my song.

It is instructive to note that this call to participate in the intelligent information blog carnival itself contains 11 questions, with one being repeated at least 3 times and another (what is intelligent information?) being linked to an answer that is (quite appropriately) a collection of observations from which the reader can start to construct an answer. And it is the interplay between questions and answers that I want to focus on now.

So back to the topic of questions and answers. Does it actually make sense to talk about "looking for answers"? Well, not really. Or perhaps we can say something a little less negative like "in a certain sense" and then "only partially". People familiar with my penchant for logic-chopping will be rolling their eyes right about now. But I am being less obtuse than usual in this one case. 
My point is that it is only occasionally, and actually quite rarely, that the questions we have will encounter ready-made answers waiting just for them.

Conversations about intelligent information, and especially those undertaken with people we would like to engage as sponsors or customers, will often include declarations like "we want to return answers not results" or "we need to expose the answers already present in our content". When the conversations get serious, however, we need to add something like "of course, you realize that the number of possible questions is infinite and there is no way you can answer every question that comes along". In fact, the best you can do is answer a small percentage of the questions that might be raised. And even then you can only guess at which of the questions will actually get asked and thus justify the effort of preparing answers in advance. Depression soon sets in.

But we don't need to feel too bad about things. None of this is new. Communicators have always had to posit a reasonable set of questions and to fashion practical answers to those questions. And they are used to the idea that they will also provide a set of reference resources that users can use to construct their own answers to questions that no one would have ever thought up beforehand. At a higher level, businesses have always had to ask themselves questions such as "what will the market look like next year" or "what features should our product have" — questions for which answers are anything but easy or certain. And those businesses cannot be sure that they are even asking the right questions, especially in a marketplace that can change as rapidly as it can today. And still the information is evaluated, decisions are taken, investments made, and outcomes measured. Embracing the brave new digital world, we might even start to feel a little better about things. Why? 
Because now we have the technologies and techniques that we can use to design, prepare, and deploy information in a way that is fundamentally more intelligent — that is granular in a way that befits the subject being addressed and that can be profiled so it fits into specific situational contexts. And information in this state, intelligent information, can be retrieved and assembled by users in a multitude of ways and used to construct answers that exactly match their unique circumstances — that answer their questions.

It is the self-serve model taken to its logical extreme. One organization will share what they know about what they are offering in such a way that other organizations, partners and customers, can mix and match that content with their own and that of other organizations to then assemble answers that they, and their systems, can act on. This is what I was talking about, to a large and largely baffled plenary audience some 20 years ago, when I spoke of managing knowledge in the "fractal enterprise". It has taken that long for us to get to the point where everyone can approach their information in this way. Indeed today, under the driving momentum of digital transformation and the fourth industrial revolution, what is now a possibility has also become an obligation. The time has come to either embrace intelligent information or take up farming, and even then you are unlikely to escape it for long unless you choose to work a small plot by hand.

Applying a digital lens to our content and to the information services we have historically provided, we see that we can indeed expose the details inside the information in ever more precise and useful ways so that users can get on with constructing the answers they need. 
And when providing information that is granular and contextualized, we can also provide the tools that those users can use to accelerate their progress towards useful answers and to turn those answers into effective actions — at the end of the day the real objective. What tools we might provide would, like the information content itself, be components that are portable, can work in a user's environment, and interact with their tools or those they engage dynamically in the cloud. If we take this digital revitalization even further, the intelligent information that is provided is also used as a way to channel the measurements taken on the outcomes of the resulting actions back to the original information manufacturers. In this way everyone learns; everyone gets better. And this feeds up into those business questions we touched on earlier about what a business should be endeavouring to do. This is what the fourth industrial revolution means for information and for information specialists.

Where have we arrived? We see, I think, that questions present a complex challenge and we cannot simply pretend that we can have all the answers ready beforehand. But rather than becoming overwhelmed by this recognition we also see that if we structure and contextualize our information in the right way, if we make it intelligent, we can move past this question, as it were, to get to where what we are doing is providing the tools so our partners and customers can construct the answers they need — using our information, their own, and that of others. We no longer try to serve up enough fish. Instead we provide the capability for our customers to fish for themselves. It is an answer that can scale and one that is ultimately far better for everyone.

Within my blog, I have been battering away at this topic for an embarrassingly long period of time. 
Below are some past posts that might be useful for people interested in exploring the question "how will intelligent information shape our future?" Note that when we look inside information, as we must if we are going to make it intelligent, then we find that we are looking at the content of that information, and this explains the prominence of the term "content" in my discussions of making information intelligent.

Content 4.0
The Marriage of Structure and Semantics
Defining Intelligent Content
A Short Primer on Intelligent Content
The Truth about Content
The Birth of Content

In constructing answers to the questions that surface around the idea of intelligent information, we are inevitably confronted with the overarching question of "what is the best way to proceed?" Does our future lie with Artificial Intelligence (AI) and increasingly capable Natural Language Processing (NLP) services? Or does it lie in semantic technologies and the escalating precision and usefulness of knowledge graph tools and techniques? Or do we need to invest further in advancing and applying content technologies to making information intrinsically more granular and contextualized? The answer in this one case is easy. It's "yes". They are all needed, and our progress on one track will need to feed into and inform the others and vice versa. It is in the co-evolution of these capabilities that we will see the real answer emerge. Then we will see what intelligent information really looks like, and it is only then that we will have a full sense of how we should address the question of how intelligent information will shape our future.

Prologue

I have a bad habit of tackling unnecessarily big topics at inopportune times. After a year of near-complete silence, I will make a gesture in that direction once more. As this is a particularly bodacious topic, I suspect that I will be coming back to it several times, both with revisions to this post and with follow-on posts. 
But in just the same way as applies whenever you are confronted with seemingly overwhelming challenges, the best way to start is to start.

One of my less welcome aphorisms goes something like this: think about something long enough and you will be guaranteed to be... wrong.

Content is a fundamentally hard problem and we have been thinking about it for a very long time. My nagging concern, and one that I cannot seem to shake (in part because it rears its head so frequently within the practical confines of projects), is that we may have thought ourselves into a corner when it comes to content. And I suspect that the area where we have collectively lost the handle on the problem of content is in how we think about and balance its two fundamental dimensions: structure and semantics. Now it is true that, in this, I may be demonstrating the validity of my own sardonic aphorism. My hope is that in returning to this aspect of the problem, and exploring how structure and semantics might be usefully separated and then individually optimized, we may find a way to build a new and more durable marriage between the two domains. That is the hope anyway and, as we well know, hope springs eternal.

Meeting the Demands of Today

Within my various project experiences, of which I have been blessed with so many good ones over the years including this year, it is increasingly clear that we have reached a tipping point of sorts. 
With the introduction of chatbots and voice assistants, not to mention Artificial Intelligence (AI) agents and an escalating demand for unfettered content interoperability between applications living anywhere within the enterprise or accessed anywhere in the cloud, we have been unceremoniously pushed off the edge of any illusions of predictability and control that we might have been unconsciously clinging to when we thought about our content assets.

The Meaning of Structure

It turns out that as soon as you try to move your content, the materials you prepare to inform and engage people, from one location to another and to continue working with it there, you run into interesting challenges. Among the stories that have descended to us from the early days of markup languages was the experience some encountered when they tried to move document files from one virtual machine within a mainframe to another virtual machine and the result was a catastrophic failure of the entire system. These challenges force us to ask some pretty hard questions. Why should the ways we structure our content matter so much to different software applications? What do we set out to do when we structure our content? And do we introduce supplemental meaning to our content when we structure it and, if so, what meanings do we convey?

In this context, by "structure" we mean the "physical organization" of content. And it does carry a meaning with it. If we try, as I have been doing repeatedly over the years, to be clear and focused when we are structuring content that will need to move between many applications, then we should isolate the meaning of structure around what I call the "publishing semantic". In effect, this is a reference to the language we use as publishing specialists when we want to organize information content in a way that is optimal for its management and for its delivery to, and use by, people. 
This publishing semantic has evolved over millennia and it features the ever-evolving practices around print publishing as well as the more emergent practices around what Ted Nelson termed "hypertext". Rather than downplaying this domain, I have spoken of it as the fundamental mechanism that sustains the advancement of knowledge in our world.

The Structure of Semantics

So what of semantics? Semantics refers to the study of meaning. If that were not bad enough, I generally continue with the observation that "meaning emerges within systems where applications lead to outcomes". To indulge in a little recursion, every word in this observation is "pregnant with meaning". One word I will pick on though is "emerges", as this tells us that meaning always emerges from working systems and in particular within language systems. It is not a far stretch to draw a more practical observation that meaning emerges from our content, from how we articulate, exchange, and interpret what we believe and what we want other people to believe. Somewhat uncomfortably for some of my good friends who prefer to see semantics as exclusively limited to what can be formalized for consumption by computers, this means that semantics comes after and is ultimately dependent upon content. But yes, among the systems and applications within which meaning emerges, finds utility, and yields outcomes are automated computer systems. As this probably trumpets loudly enough, semantics is a challenging problem area and as such it is a worthy companion to structure in our understanding of content. I am, again unhelpfully, reminded of what Maurice Merleau-Ponty had to say: "Because we are in the world, we are condemned to meaning" (Phenomenology of Perception).

In slightly more practical terms, semantics refers to how we define, manage, and apply meaning to things - and this includes to content, or shall we say to structures within our content. 
One very good thing that has resulted from our hitherto mediocre attempts to instruct computers on how to do something useful with semantics is that we have been forced to apply more and more formality to how we articulate semantics. Interesting, at least to me, is the fact that in formalizing how we articulate semantics we find that what we are doing is structuring those articulations in ever more precise ways. We use structured content to declare our semantics, and this observation can actually be used to help improve how we tackle the problem of defining, sharing, managing, and leveraging the semantics of an enterprise.

The Successful Marriage of Structure and Semantics

So from the above ruminations we come away with two things (in addition, perhaps, to a headache):

Structure refers to the physical organization of content, and we structure content so as to optimize how we manage content assets so we can deliver informative and compelling messages to people (what I refer to as the "publishing semantic").

Semantics refers to the definition, management, application, and use of meaning that emerges from and in turn informs the systems (including automated systems, but in no way exclusively) at work within an enterprise that seek to achieve goals and that generate outcomes as a result.

So what is it that we are doing when we set out to combine structure and semantics in our content? Beyond the unavoidable basics of applying the publishing semantic to our content, what we are doing when we apply semantics is relating components within our content structure to concepts that make sense within one or more systems where applications endeavour to achieve specific goals and where they will generate outcomes that range between stellar success and abject failure. It is for this reason that we talk sometimes of "annotating" content with reference to a given taxonomy of concepts - we are associating content components with nodes in a semantic model that has been formalized in some way. 
Why would we want to do that? Well, perhaps the systems in question can leverage parts of the content to achieve their goals. In fact, this is exactly why.

If our application is to help users to find specific content based on a unique context, then the annotations we apply will help to filter what would otherwise be a mountain of content to the subset that will be relevant to that context. Or, as another example from a past project, perhaps a manufacturing application needs to extract very specific details from a legally-mandated document source, and the annotations applied ensure that this extraction and application of content details works at a level of precision that will be compliant and will avoid catastrophic failure in the manufactured system. In this example, the semantics emerged from, and vitally lived within, a variety of systems including an external third party who controlled the legally-mandated document sources. Those semantics needed to be precisely articulated and then exactly applied to structural components within the document content.

While there is a great deal to be discussed about how exactly we should facilitate the integration of structure and semantics in content assets, such that the resulting assets can move continuously between the many systems and applications operating within an enterprise, the fundamentals are clear. The semantics we may wish to apply to content structures will exist outside of the content even if, as I highlighted earlier, those semantics are articulated with reference to discourse (the dynamic interchange of information content) and articulated using structured content framed for that purpose. 
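The filtering scenario described above can be made concrete with a minimal sketch. Everything here is illustrative: the component names, the taxonomy concept identifiers, and the idea of expressing a user's situational context as a set of concept IDs are all my own assumptions, not a reference to any particular system.

```python
# A minimal sketch of annotation-based filtering: content components carry
# taxonomy annotations, and a user's situational context (also expressed as
# taxonomy concept IDs) filters the "mountain of content" down to the
# relevant subset. All identifiers below are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Component:
    component_id: str
    body: str
    annotations: set = field(default_factory=set)  # taxonomy concept IDs

def relevant_components(components, context):
    """Return only the components whose annotations cover every
    concept in the user's situational context."""
    return [c for c in components if context <= c.annotations]

corpus = [
    Component("c1", "Replacing the pump seal...", {"maintenance", "pump-x200"}),
    Component("c2", "Pump X200 sales overview...", {"marketing", "pump-x200"}),
    Component("c3", "Safety notice for valve V3...", {"safety", "valve-v3"}),
]

# A field technician servicing a Pump X200:
context = {"maintenance", "pump-x200"}
print([c.component_id for c in relevant_components(corpus, context)])  # -> ['c1']
```

The point of the sketch is that the filtering logic itself is trivial once the annotations exist; the hard, valuable work is the prior association of components with nodes in a formalized semantic model.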
What we must do then is find the simplest, most intuitive, and most easily managed way to make the necessary associations between our content structures and the semantics that may apply to them. Perhaps uniquely, I have come to cast metadata in this role - as the overlap between, or intersection of, the structure and the semantic domains. I have further described it as a subset of content structure that formalizes and facilitates the connection of content components to semantic contexts. How much of the semantic domain needs to be reflected in the metadata may vary from case to case, so long as the connection being established will be sufficient for its intended uses.

It is this relation to "uses" that I think is critical. The "meta" in metadata actually means "after" and, while this is almost universally ignored or downplayed, I believe it is fundamental. If we stress this idea of "after", what we are saying with "metadata", I think, is that we are establishing the context of content structures for use by one or more system applications. And so long as we keep our notion of systems and applications sufficiently broad, we can see that this conception of metadata covers the full range of considerations including security, administration, discovery, and automated processing. 
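To make this notion of metadata as the connective tissue between structure and semantics a little more tangible, here is a minimal sketch of the "out-of-line" style of association, where metadata records live apart from the content and point at structural locations. The component IDs, concept paths, and field names are all hypothetical illustrations, not any standard's vocabulary.

```python
# A minimal sketch of out-of-line metadata: association records kept
# outside the content, each pointing at a structural location (here, a
# component ID) and connecting it to concepts in a separately managed
# semantic model. All names below are hypothetical.

content = {
    "sec-1": "Installing the controller firmware...",
    "sec-2": "Disposal and recycling instructions...",
}

# Metadata records: the intersection of the structure domain ("target")
# and the semantic domain ("concept", "audience").
metadata = [
    {"target": "sec-1", "concept": "task/installation", "audience": "technician"},
    {"target": "sec-2", "concept": "regulatory/weee",   "audience": "end-user"},
]

def context_for(component_id):
    """Resolve the semantic context established for a content component."""
    return [m for m in metadata if m["target"] == component_id]

# An application (discovery, security, automated processing, ...) can now
# act on a component's context without the content itself being touched:
print(context_for("sec-2"))
```

The same associations could equally be carried inline within the content structures themselves; the point of the sketch is only that the connection, not its physical location, is what the metadata establishes.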
It is also fully compatible with any and all possible ways of establishing the conceptual connections, whether that uses "out-of-line" metadata (where the metadata exists totally separately from the content and is linked to specific structural locations) or "inline" metadata (where the metadata exists within, and as part of, the content) or some fusion of the two.

It is down this path, I would contend, that we indeed find a practical way to separate the concerns of structure and semantics, to optimize each in turn, and then to facilitate the efficient and effective integration of the two domains in a happy, and resilient, marriage.

Mathematical Content

I actually do have some preferences on how metadata is physically integrated into content structures, although even in these preferences there is flexibility about when in the content lifecycle it needs to happen. I myself, and perhaps this is a confession on several levels, like it when metadata structures are relatively complete and informative and then are incorporated by reference into the content structures as a physically simple way to establish the metadata association. This can be an attractive and intuitive approach from the perspective of the authoring experience (how the people responsible for preparing the content assets come to fully understand the contexts within which the content will be used). And this can be highly applicable late in the content lifecycle, when content assets are staged for use, when the metadata (which may be further resolved and expanded with reference to the associated semantic models) is leveraged when indexing the content for high-performance retrieval and high-precision application consumption.

As a strange association, I find myself thinking, of all things, of Alan Turing and his seminal "On Computable Numbers" (1937). 
What I specifically find myself thinking about is how the approach to handling metadata that I have only glanced upon above might be used to facilitate a more efficient and scalable method for processing content assets. What I have been exploring is how this approach to metadata, as the physical intersection of the domains of structure and semantics, might be used to optimize content processing scenarios as the standardized handling of sets, and this, again following Turing's lead, will also help us to see and understand the limits of content computability. So it looks like I am thinking of a potentially terrifying blog post called "On Computable Content".

All this is to say that there are some very interesting and very attractive reasons for looking at the structure and semantics of content with a different lens and then consciously forging a working solution that leverages the unique strengths of each domain. When all is said and done, what we should have is a marriage of structure and semantics that can stand the test of time and even withstand the many challenges that will invariably arise.

But perhaps I have been thinking about all this for too long...

We have lingered in the chambers of the sea
By sea-girls wreathed with seaweed red and brown
Till human voices wake us, and we drown.
- T.S. Eliot, The Love Song of J. Alfred Prufrock (Content Engineer)

What better way to emerge from a year of distractions than to tackle an impossible topic. Even if this attempt was only tangentially successful, it would feel like being a phoenix emerging from the ashes. Hence my choice for a signature graphic and hence my choice of a case study with which to illustrate, or at least gesture towards, what the content of systems might be.

My choice of case study is a mega software project within the Canadian Federal Government - one appropriately called the Phoenix Pay System. 
It may sound a little too provincial to be instructive beyond the insular world of Canada's capital, Ottawa, but it is loaded with lessons for everyone as it is steaming towards the unenviable title of worst software project ever. When we are building software, we usually have a vision of what failure would look like, and sometimes of what success would look like too. Few on this project would have foreseen billboards being erected publicly demanding that the system be fixed, or masses of unpaid, or inaccurately-paid, public servants taking up placards and pitchforks in protest. Somewhat maliciously, I have been revelling in the spectacle. Somewhat depressingly, I have been ruminating on the fact that this debacle is an exemplar of how large software projects typically go and that the lessons from this disaster will be lost on everyone who really should be paying attention. Why these lessons will be lost can be traced back to two sources: the politicians and bureaucrats on the customer side of the equation are effectively incapable of learning from experience, and the systems integrators and consultancies responsible for delivering these systems do not want to learn them (failure, in the software business, is lucrative).

Setting aside that cheery thought, let's move on to what lessons we can unilaterally extract from this disaster, in particular on what this tells us about the role of information content in software systems. This particular example is useful in this regard in part because I know a little too much about it and about the problem area to which it sought to bring order and savings. The Phoenix Pay System was named after the mythical bird that cyclically emerges from the ashes of its own immolation. The reason the phoenix moniker was taken is that this project emerged from a predecessor project called the Public Service Compensation System (PSCS) that was steered off the road in the mid-1990s. 
Upon leaving the military in 1991, where I had been managing a major pay and personnel software project, I was counselled by an uncommonly wise executive with a major software integrator to join the PSCS project. His words still ring in my ears: "Everyone should be part of a major software project disaster, and as early in their career as possible." Strangely, I took that advice and served on the PSCS project during its brief ascent and precipitous descent (we actually initiated the shutdown of the project well in advance of its failure in the public sphere - illustrating a lesson that was then lost on the Phoenix project). I was also then implicated for almost 10 years in the court case that came out of that project, where the federal government, successfully it turned out, sought damages from the system integrator behind the project. So I can claim to have some insight into this general problem area and into this particular example.

One of the tasks for which I was responsible on the ill-fated PSCS project was to establish a full-text database that contained all of the contracts and collective agreements that would bear upon the business rules that an integrated compensation system would need to apply. In effect, we set out to understand the content that the system would need to understand and effectively work upon. Complicating matters, PSCS, like Phoenix after it, sought to introduce a single system that would support literally hundreds of different departments, agencies, and agreements. In analyzing the contents of all these agreements, we were acquainted with the bizarre capriciousness of the sublunary world of human negotiations. 
We joked on the project that there was a unique rule that would only apply to left-handed lighthouse keepers who, in addition to the present, had worked between 1983 and 1989, and this rule would come into effect in circumstances where there was light rain forecast within 100 km of any one of the lighthouses they had worked at in the preceding 18 months. We were joking, sort of. The silliness of this scenario was in fact an expression of frustration that the business rules we were supposed to implement were just this crazy. How is software, even infinitely effective software, supposed to deal with this substratum of contractual content? The answer is that it can't. Only fools or crooks would pretend otherwise, and remembering bureaucrats and software integrators we see that we have struck gold in finding both.

The debacle of the Phoenix Pay System illustrates this point perfectly. We start with a customer determined to find a mythical outcome by acquiring a magical system that would allow them to retire hundreds of small, and wilfully quirky (in line with their quirky contractual contexts), pay systems, and to reduce the number of pay specialists needed to navigate that strange world of capricious business rules. We then add a major system integrator and a major consultancy who were most eager to design such a magical system and then to write the business case showcasing the massive savings to be had by its implementation. Hugs and handshakes ensued. The software solution was hammered out, as usual in these types of projects, through the extensive customization of an off-the-shelf software product (ignoring the ill-effects of the Barnaclization of Systems), and the resulting system was thrown into production so that the reductions in pay specialists could be realized in line with the business case. But then reality reared its often ugly head and the shortcomings with the system, and indeed the entire concept, exploded into view. 
Hugs and handshakes ensued once again, but this time only on the side of the system integrators and consultancies - who would now be paid hundreds of millions of dollars more to first fix and then, in all likelihood, replace the phoenix system. On the customer side, the ill-consequences of childish naïveté begin to be felt as bureaucrats scramble to reassemble the pay specialists that they had only recently released and politicians spin tales about how the next tsunami of spending will make everything well. Rest assured that the cycle will repeat, making the phoenix name painfully appropriate.

So what does all this tell us about the content of systems? As I have elaborated upon elsewhere, content is the substratum of communication, the physical representation of what we are saying - including what we have negotiated, and what we are designing, planning, and justifying. Within complex software systems, the content that is really important is not so much the end user documentation and training materials, important as they may be, but rather the exposition of how the system will work. This exposition will ideally be done with a precision, formality, and completeness that will allow the system to operate in a variety of ways - including through different software implementations and, most importantly, through a complex interaction between people (who will be knowledgeable and held genuinely accountable) and automation that can actually handle, and evolve to efficiently expedite, situations of unlimited capriciousness and complexity.

Where content is understood as the lingua franca underpinning systems, and not merely its public face, we can start to envision, design, implement, and evolve genuinely sustainable systems - ones that work with and build upon, instead of wishing away, institutional and individual responsibility. 
Needless to say, this would be unpopular and unwelcome as it would limit the ability of customers to eschew their accountability and of integrators and consultancies to convert the dead weight of ignorance into financial gold. Maybe this helps to explain why content, and content technologies, have historically been such a hard sell - promoting as they do a grounded approach that tries to end, instead of fuelling, the destructive cycle of the phoenix.

Each year, I select an idea and proceed to pummel it relentlessly in a series of presentations, posts, and tweets. Last year it was the idea of Integrated Content. In 2016, it was the idea of Content 4.0. This inquiry was prompted by a number of concurrent discussions that have been exploring the relationship between the work of technical communicators and the emergent concept of Industry 4.0, also referred to at times as the Industrial Internet of Things (IIoT). As is my habit, I took up the challenge and carried it further than was probably wise. Nonetheless, I am hoping there is merit in summarizing this inquiry so that we can consider for a while what the implications will be for those who work in the business of communication.

The obligatory starting point would be to set out the four stages of industrial evolution. From this we can then look at other concepts (Web, Content, Information, and Technology) and see whether similar evolutionary stages can be discerned.
We can also explore whether or not there are forms of interaction or influence between what is inescapably happening in the industrial sphere and these other concepts.

Accepting for the moment that history itself never plays by, or exhibits, anything analogous to stages or levels, and that it is dangerous to think about things in terms of forward momentum (aka progress), we do find that there is value in reviewing how things have changed and how these changes have impacted collateral activities, such as the way in which people communicate technical information.

With all that taken as a given, let's look at the evolutionary stages of Industry:

Industry 1.0 - The introduction of steam power to manufacturing in the initial industrial revolution (late 18th and early 19th century). Also associated with the application of emerging concepts such as the division of labour, made famous by Adam Smith in his Wealth of Nations (1776).

Industry 2.0 - The introduction, in the early 20th century, of electricity and innovations such as the assembly line. Henry Ford stands as the inevitable, and well-deserved, example. Also fruitfully associated with Frederick W. Taylor's Scientific Management (1911) and the pursuit of ever-improving efficiency in the form of "one best way" to do just about anything.

Industry 3.0 - The introduction, starting in the 1950s, of automation to manufacturing tasks. Also associated with practices and principles generally termed "lean manufacturing" that emerged out of World War II production techniques and that were brought to high levels of refinement within Japanese management models for streamlining manufacturing processes and supply chains.
Industry 4.0 - An umbrella concept for what is emerging as the next generation of industrial practices given the combined capabilities of universal supply chain connectivity, virtually complete automation of manufacturing tasks, smart parts that convey their own manufacturing and positioning instructions, and reconfigurable assembly lines replete with data-gathering sensors that interact with each other, with smart parts, with suppliers, with logistics coordinators, with management systems, and with the customer demands that initiate all of this activity.

In my initial exploration of this line of thinking, I immediately recalled a not-dissimilar set of evolutionary stages having been described for the World Wide Web. In particular, I recalled interactions with Mills Davis of Project10x, whose work on the Semantic Web as Web 3.0 provides the basis of what I will sketch out here. It turns out that a set of four evolutionary stages can be overlaid quite neatly onto the 25-year history, to this point, of the Web.

Web 1.0 - The initial creation of Sir Tim Berners-Lee and what some of us recall with a mixture of fondness and bemusement. The overlay of a hypertext interface onto the connectivity infrastructure of the internet, which was, in the early 1990s, becoming publicly accessible for the first time.

Web 2.0 - The explosive growth of social media platforms, propelled in part by the concurrent explosion in mobile computing, which led to what has been called the Social Web.

Web 3.0 - The progressive, if gradual, introduction of machine-readable semantics into web information content so that automated programs can perform an expanding array of useful tasks in areas such as discovery, recommendation, and personalization.
This stage is often referred to as the Semantic Web.

Web 4.0 - The culmination of all the preceding stages in a massive global network of internet-connected devices, sensors, and services that together can perform increasingly sophisticated activities (IoT).

The original point of reviewing these two histories, that of industrial and web evolution, was to establish a landscape within which we could consider how our notion of content has been changing. Does content, in effect, follow a similar trajectory? Let's take a look.

Content 1.0 - Content is created and managed as an integral part of the information product within which it is delivered to information consumers. Examples include a cave painting, a traditionally printed book, and, if we are being honest with ourselves, most websites, including those offering measures of design responsiveness. There is, with Content 1.0, effectively no separation made between the intellectual and rhetorical import of an information product (its content) and the formatting rendition or application behaviour that is applied to it. We should be clear here that the vast majority of content in the world exists in this state and much of it would have little to no reason to be treated otherwise.

Content 2.0 - There are scenarios, many of them business-critical, where it is advantageous or even essential that the content within information products be isolated and managed separately from the format and behaviour of individual information products. This scenario typically arises when there is a need to produce multiple different information products from a single, shared source. It is at this level that it makes sense to talk about content management as the genuine management of content, as opposed to something else.
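To make the Content 2.0 separation concrete, here is a minimal sketch (entirely hypothetical; the element names and rendition functions are my own invention, not drawn from any particular product or standard) of a single shared source being rendered into two different information products:

```python
import xml.etree.ElementTree as ET

# A single shared source: content with no formatting decisions baked in.
SOURCE = """
<procedure id="reset-device">
  <title>Resetting the Device</title>
  <step>Hold the power button for ten seconds.</step>
  <step>Wait for the status light to blink.</step>
</procedure>
"""

def render_html(xml_text: str) -> str:
    """One rendition: an HTML fragment for a web help system."""
    root = ET.fromstring(xml_text)
    steps = "".join(f"<li>{s.text}</li>" for s in root.findall("step"))
    return f"<h2>{root.findtext('title')}</h2><ol>{steps}</ol>"

def render_text(xml_text: str) -> str:
    """Another rendition: plain text for a printed quick-reference card."""
    root = ET.fromstring(xml_text)
    lines = [root.findtext("title").upper()]
    lines += [f"{i}. {s.text}" for i, s in enumerate(root.findall("step"), 1)]
    return "\n".join(lines)

print(render_html(SOURCE))
print(render_text(SOURCE))
```

The point of the sketch is that neither rendition function owns the content; both draw on the same authoritative source, which is what makes coordinated multi-product publishing manageable at this level.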
This is where much of the focus of the content management and publishing industry is currently directed - helping organizations to manage their content assets efficiently and to leverage them effectively in a variety of coordinated information services.

Content 3.0 - The logic that drove the separation of content from formatting and behaviour, seen in Content 2.0, is taken to its fullest conclusion, with the goal becoming the management of content as an integrated library of assets where all details are managed at their most authoritative source. Content 3.0 can be referred to as Integrated Content, and the focus at this level is on leveraging the integrated, and integrative, nature of content to optimize how organizations operate and how they interact with their external stakeholders, including customers. Only a select few practitioners in the content management industry have undertaken projects at this level, although the demand from leading organizations has been building steadily.

Content 4.0 - The circumstance seen at the first stage, Content 1.0, is effectively reversed with Content 4.0, as the content asset becomes the encapsulating parent for a range of information product renditions and behaviours. Content is planned, designed, created, managed, and exchanged as objects that incorporate not only the intellectual and rhetorical import but also the associated rules governing the structure and meaning of the content and an array of rendition and behaviour processes that the object can use to render that material independently or in concert with other content objects. What this means in practical terms is that the content that is being managed is highly precise, so that a variety of application processes, including multiple rendition scenarios, can operate on that content with a high degree of confidence and effectiveness.
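One way to picture such a Content 4.0 object, as a rough sketch under my own assumptions rather than a reference to any existing standard or product, is as a unit that carries its content together with its governing rules and its known rendition behaviours:

```python
# Hypothetical sketch of a Content 4.0 object: the content travels with
# the rules governing its structure and meaning, and with the rendition
# behaviours it supports, so any downstream publisher can use it with
# confidence. All names here are illustrative inventions.

class ContentObject:
    def __init__(self, body: dict, rules, renditions: dict):
        self.body = body              # the intellectual and rhetorical import
        self.rules = rules            # constraints on structure and meaning
        self.renditions = renditions  # named behaviours carried with the content

    def validate(self) -> bool:
        """Check the content against its own governing rules."""
        return all(rule(self.body) for rule in self.rules)

    def render(self, name: str) -> str:
        """Apply one of the carried rendition behaviours."""
        if not self.validate():
            raise ValueError("content violates its own governing rules")
        return self.renditions[name](self.body)

warning = ContentObject(
    body={"severity": "caution", "text": "Disconnect power before servicing."},
    rules=[lambda b: b["severity"] in {"note", "caution", "danger"},
           lambda b: len(b["text"]) > 0],
    renditions={
        "html": lambda b: f'<p class="{b["severity"]}">{b["text"]}</p>',
        "voice": lambda b: f'{b["severity"].upper()}: {b["text"]}',
    },
)

print(warning.render("html"))   # a web rendition
print(warning.render("voice"))  # a spoken-prompt rendition
```

Because the object validates itself against its own rules before rendering, a downstream publisher can invoke whichever carried rendition suits its context - which is the reversal of Content 1.0 described above.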
This means, in simple terms, that the content co-exists with the complete array of known behaviours that it supports and is exchanged with stakeholders who will publish and use the content and behaviour in their own environments and to meet their own goals.

In some of the discussions that were occurring around these ideas, for example a LinkedIn Group converged around Documentation 4.0, the phrase Information 4.0 is being widely used, so it is worthwhile considering the evolutionary stages that might apply to information. Illustrating why it is important to consider content separately from information (as working concepts), we can see in the evolutionary sequence associated with information that when we focus on information we focus on something very important - on the concepts of authority and accountability. This is where another term, document, comes into play, with a document being the transactional artifact in an information exchange. The word document summons up, quite usefully, a connection to legal deliberations, and this helps us to attend more closely to the true nature of what we are talking about at this point - that information is an event, a transaction, an action for which people and organizations can, and should, be held accountable.

Information 1.0 - The world of paperwork, quite literally. When I discuss this level, I call up historical examples from past empires that, whatever else you might say about them, illustrate how the careful handling and processing of even paper documents can be leveraged to undertake and sustain monumental activities. The Roman Empire, the British Empire, the Allied war efforts in World War II, the engineering feats of the Cold War era - all provide illustrations of effective information handling that quite frankly puts our time to shame.
Information Management guiding light Paul Strassmann, in his book The Politics of Information Management, summoned up the example of the Roman Catholic Church as a model of the effective handling of information (with some notable recent exceptions) that modern organizations would do well to learn from.

Information 2.0 - The initial attempts to conduct document transactions electronically in a way that was binding and reliable, which emerged in the 1970s as Electronic Data Interchange (EDI). Anyone who has had the pleasure of working on an EDI implementation will know that, in addition to coordinating lower-level data definitions and working around the prescriptive structures of X12 or EDIFACT, there is a lot of effort applied to the exchange and delivery frameworks. These frameworks, often grounded in legacy standards such as X.400, were intended to establish security, control, irrefutability, and so on - such that the confidence previously invested in paper trails could be assumed by their electronic replacements.

Information 3.0 - This level refers to the emergence in the 1990s of Workflow Automation systems capable of operating across organizational boundaries and then of Business Process Management systems that could do the same on increasingly sophisticated scales. Among the factors in play at this level was the adoption of XML as a more extensible basis for EDI networks, encompassing both data representation and security measures. In his best-selling book The World is Flat, Thomas Friedman, a New York Times columnist, directed an unusual, but well-founded, amount of attention towards the effects of XML as one of the levelling factors that changed the face of global trade.
This occurred in large part because major software vendors like Microsoft aggressively embraced XML as a messaging and data interchange format (web services) so that they could break away from the limitations of highly prescriptive EDI standards and from overweening software integration frameworks like CORBA that never achieved widespread adoption.

Information 4.0 - As befits the notion of the Internet of Things (IoT), Information 4.0 is really about establishing interaction frameworks between people, organizations, devices, and services so that there is accountability for the resulting behaviour and, presumably, a measure of effectiveness when viewed from the perspective of all legitimate stakeholders. It is my inclination, when exploring this level, to talk about authority networks that link a series of actors leading up to events, such that there is traceability from observable events back to the responsible parties. As a side note, this important aspect of Information 4.0 demands very specific things from the content underlying the participating information transactions, and this helps us to fully understand some aspects of what is required of Content 4.0.

While we are at it, we might as well cast a quick look at the evolution of computing technology. Many of the large consultancies have been talking about the Third Platform as a nexus of current advances including mobile devices, big data, the proliferation of data-gathering sensors, cloud computing services, and the growing excitement around cloud-based cognitive computing. Closely associated with this has been a yet larger concept, that of Digital Transformation, which has been even more energetically promoted by the large consultancies as an umbrella term for the business and societal impacts of the digital revolution epitomized by the Third Platform technological convergence.
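Returning for a moment to the authority networks mentioned under Information 4.0, the traceability idea can be sketched as follows (a hypothetical toy model; the actor names and the delegation structure are my own inventions, offered only to make the idea concrete):

```python
# Hypothetical sketch of an authority network: every actor records who
# delegated its authority, so any observable event attributed to an actor
# can be traced back to the responsible parties.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Actor:
    name: str
    role: str
    delegated_by: Optional["Actor"] = None  # who granted this actor its authority

def authority_chain(actor: Actor) -> list:
    """Walk from an actor back to the root of its authority."""
    chain = []
    current: Optional[Actor] = actor
    while current is not None:
        chain.append(f"{current.name} ({current.role})")
        current = current.delegated_by
    return chain

regulator = Actor("Transport Authority", "regulator")
operator = Actor("Acme Logistics", "operator", delegated_by=regulator)
sensor = Actor("Dock Sensor 12", "device", delegated_by=operator)

# An event reported by the sensor traces back to the accountable parties.
print(" <- ".join(authority_chain(sensor)))
```

Each recorded event would reference the actor at the end of such a chain, so accountability can be walked back from any observable event to the people and organizations behind it - which is precisely what the underlying content must be precise enough to support.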
I would be inclined, as I have in the above illustration, to keep all of this at the third stage in the evolution of computing technology (what I have termed cloud computing and, elsewhere, internet systems) and then to posit a fourth stage which is purely emergent, based on the assembled capabilities of the third platform. We may only now be able to glimpse some of what will emerge under Technology 4.0, but it seems safe to say that it will entail massively distributed capabilities that make the most of ubiquitous sensing infrastructure, inexhaustibly detailed data resources, unlimited processing power, and escalating software intelligence to deliver new orders of functionality, including functionality that learns and improves over time (Machine Learning). This evolutionary scale for technology aligns very neatly with, and substantiates the evolutionary changes being seen within, the other concepts that we have been exploring - those of industry, web, content, and information.

Understanding Content 4.0

This illustration tries to summarize a lot of what we have covered here and to do so in a way that shines a spotlight onto the content assets that technical communicators would find themselves working with at the various stages. This illustration emerged from a collaboration between myself and Marie Girard at IBM. Essentially, Marie organized the sprawling mess of ideas seen in early discussions of Information 4.0 and Content 4.0 and used the basic layout seen above as a way to give those ideas a more accessible form. I then, as I often do, complicated the picture further. The result succeeds, despite my best efforts, at conveying a lot about what we have been exploring in this post. We can see, for example, how the focus of communicators shifts from the publications themselves towards progressively smaller and smarter content components.
At the most advanced stage, communicators divide their attention between very small units of content, which we came to refer to as "molecules", and the encapsulating content objects that combine these molecules into components and topics and match them to application behaviour such as rendition instructions. The molecular level of content is particularly interesting because, while it is almost never sufficiently comprehensive to be deemed standalone and self-sufficient, it is often the answer that people are seeking when they consult information products or services. We can also see that as we move towards Content 4.0, there is a greater and greater awareness of, and interaction with, the various applications that govern either where the content sources hail from or where the content assets will go as part of their publication and delivery.

Conclusion

At the very least, this inquiry does point toward a confluence of forces at work across a number of fields, with all of them driving towards smaller and smaller components, each exhibiting more and more independence and intelligence, and all of them assembling and interacting to achieve ever more ambitious capabilities. This should tell us that the business of communication simply cannot sit back and pretend that the world is not changing. More specifically, it tells us that, as communicators, we need to fundamentally rethink how we plan, design, create, manage, and modify content assets and how we publish and exchange the resulting information products. Sticking with the tried and true, and the comfortably familiar, is not an option. As with other aspects of digital transformation, determining exactly what we should be doing next is even more difficult than admitting that change is necessary.
That said, it should be uncontroversial, given what we have been exploring, to say that the future of technical communication is fundamentally more technical, more fully and continuously integrated into the associated product lifecycles, and much more actively engaged in collaboration with lifecycle stakeholders than it has been in the past. And if there is any agreement that authority and accountability, not to mention scalability and sustainability, are important in the brave new world of technology-mediated everything, then I would submit that Information 4.0, together with its substantiating and supporting Content 4.0, will become more and more central concerns for the modern enterprise instead of being peripheral, which is where they have hovered until now. This means that the golden age of technical communication lies before us. In this we should understand technical communication as both a technological undertaking for making information fundamentally more informative to both people and machines, and as a facilitating practice that builds a grounded understanding of the technology we use, the technology we create, and the emergent technology that we will increasingly rely upon. This is a rather large and sweeping idea to end on, so it seems fitting to hand things over to William Blake, the poet of the original industrial revolution, who can help us...

To see a World in a Grain of Sand
And a Heaven in a Wild Flower
Hold Infinity in the palm of your hand
And Eternity in an hour

- William Blake (Auguries of Innocence, c. 1803)

Epilogue

Since the end of 2016, there have been some further explorations that sought to map other concepts into a similar four-phase structure. One of the most illuminating was the idea of Consumer 4.0, and it was illuminating because it helps to shed light on why Content 4.0 needs to be the way it is.
Essentially, Consumer 4.0 is notable for introducing the idea of Conducers - customers who take a product, and even its constituent parts, so that they can create a new product. This lines up with the idea associated with Content 4.0 that there are many publishers of the same content - you don't know who will be publishing the content you are preparing, or how. For this to be viable, you need to provide these Conducers with content objects from which they can fashion new experiences, sometimes leveraging the behaviour that you were planning, and preparing, for and sometimes extending or overriding it.

Then as we entered 2018, and prompted by the strangest of predilections, I realized that I could not leave things to stand as a six-layer model. There would have to be seven layers. I then realized that I was in fact missing my original layer from this picture, one that looked at the nature of business organizations and that arrived, in its highest state, at the Fractal Enterprise. And this is fitting, as I had commenced these inquiries with a conference keynote talk on Managing Knowledge in the Fractal Enterprise in 1999 - almost 20 years ago. At the initial level, Business 1.0, we find cottage industries, made up of small-scale organizations - often rooted in a family or community. The next stage was the multi-owner company, emerging in the early modern period as a mechanism for marshalling more formidable resources and distributing risk. In recent times, there has emerged the virtual company, where expertise and resources are drawn together from across an integrated supply chain in which each participating company has been optimized to a specific role. Finally, and in conjunction with the other occurrences at the 4.0 level, we find the Fractal Enterprise as something that is only now emerging and whose shape is neither clear nor static.
Below is a set of slides, together with explanatory notes, that (together with this post) represent the final state of my handling of the concept of Content 4.0 in 2016. These slides were progressively refined for a series of deliveries and discussions including a webinar given for Thought Leader Thursday at The Content Era, a keynote TED-style talk at the CIDM Best Practices Conference in Santa Fe, a keynote address at Lavacon Las Vegas, a session and group discussion at TCWorld 2016 in Stuttgart, and an uproarious opening debate at the CIDM DITA Europe conference in Munich. Buried in this slide deck (see slides 43-44) is an idea of human-cyber-physical systems which I intentionally kept out of this post but to which I will return again.

Information 4.0 for Industry 4.0 (TCWorld 2016) from Joe Gollner

As a final takeaway, here is a single-sheet image that lines up all of the historical trends that we have been exploring and comparing. With the magical importance of the number 7, one more layer has been added - business - with the 4th phase being associated with the Fractal Enterprise, which takes me back to a keynote presentation I did in 1999 (which in turn makes me feel very old).

This will be a short detour back in time. Back to a keynote address that I gave at Lavacon 2014 in ever-enjoyable Portland, Oregon. That talk then leapt further back in time with case studies drawn from across a 20-year period (25 years if we are being honest with ourselves). The purpose of this retrospective was to unearth the secrets to success in content initiatives and in particular the secrets to successes that have stood the test of time. Below is a recording of my full presentation, with both slides and arm-waving, so what I will do here is summarize the key points from the seven (7) architectural secrets to success (aligned with the lessons of lean manufacturing) and the three (3) managerial secrets to success.
This makes for a nice even ten (10) secrets to success with your content (management / modernization) initiatives.

The Seven (7) Architectural Secrets to Success

Get Together. Enable collaboration across business and disciplinary silos by applying concurrent engineering tactics. A high-quality outcome depends, almost entirely, on engaging and integrating all of the stakeholder perspectives that apply to any given undertaking.

Try Before You Buy. Genuinely try out the technologies that you are considering as candidates. This is fundamentally different than watching demos and attending user conferences. How you acquire the content technologies that you will deploy is in fact more important than what you acquire. In large part this is because technologies are not neutral (they are never neutral) and each technology will impact how you do your business. It is folly, demonstrated daily within organizational Information Technology (IT) shops, to try to nail down the business requirements and then shoe-horn them into a technology. IT shops may be addicted to this form of error (as it does guarantee life-long employment), but content professionals need to do better.

There's an App for That. Leverage automation aggressively and systematically. Most organizations under-utilize the content technologies that already exist. This means that they are generally working much harder than they need to. This also means that they are not delivering as much value as they could be. There are decades of experience rolled into the content technologies that are available today. You don't need to re-live these past experiences or to attempt to re-invent what has already been developed.

Throw Yourself into the Numbers. Engage in measurement with the specific purpose of improving how well you are performing and how well your content is performing. The content business has historically under-utilized measurement and this is a shortcoming we need to fix.

Think Small.
Break content and systems into their smallest viable units. Use these to build up higher-level constructs, which will then be far more efficient to manage and far more scalable than the older monolithic approaches. Once content has been modularized, it becomes possible to reorganize that content in different ways than have been done in the past. It is remarkable what comes to light when we do this. Going further, this strategy can be used to redress the universal tendency towards barnaclization. Smaller pieces engender content structures and systems that are intrinsically more maintainable - simply because you can get in there and fix them.

Walk the Talk. Following the line of thinking from Think Small, we look to build our systems and practices using articulated processes. This is fundamental to what distinguishes lean manufacturing from other models and illustrative of how content technologies can be leveraged to improve how every enterprise does what it does. It is essential for any effective and sustainable complex system that it be open to inquiry and intervention. This means that stakeholders can interrogate the inner workings of the system, understand it, and modify or fix it as necessary. This applies generally to technology management but also quite specifically to content management systems (which are intrinsically complex).

Build to Last. Systems developed using the above principles will stand the test of time and will deliver continually accruing benefits. Many (most) systems do not achieve this desirable state and instead become sinkholes into which funds and energy are sucked. It should be clear which of these two possibilities we should be working towards.

The Three (3) Managerial Secrets to Success

Speak Management. Managers of communications groups or content management projects rarely seem to have much prior management experience - or experience talking the lingo of management (including numbers and money).
We need to change our ways so as to engage executive stakeholders on their turf and in their terms.

Lead the Way. Introducing new tools and processes means introducing change - and change is always hard. This is where leadership is called for. Real leadership. And real leadership means taking full responsibility for the outcomes so that your team has the space in which to take risks and the time it needs to make the necessary changes.

Follow the Content. Follow the content back to all of its sources and forward to all of its uses. This means applying the effort to really understand your content assets and to ensure that all of your processes and technologies respect that content for what it is and for what value it really delivers.

The video of this keynote runs for almost an hour, but it covers an awful lot of material. So I do recommend it. Looking back at it after a couple of further years of work on my thinking, I am gratified to see that I would not revise anything in these ten steps.

LavaCon 2014 Virtual Track Day 3 - Joe Gollner from LavaCon Conference on Vimeo.

You can also just flip through the slides, sans storytelling arm-waving: Secrets to Content Initiative Success (Gollner Lavacon 2014) from Joe Gollner

In the post Integrated Content Management, we dug deeply into the integrated, and integrative, nature of content. One of the things we took away from this exposition is the recognition that the real power of content lies in the fact that it can be used to build bridges between an enterprise and its customers, and between the business silos that exist within the enterprise.
Content can perform this special function because it strives to be truthful and as such exists below the level of politics and spin that characterizes so many of the information exchanges that typically obstruct our efforts at collaboration. In a nutshell, content can be used to achieve things within and between organizations that cannot be efficiently, or effectively, achieved by any other means. Essentially, we can use content to build genuine and durable connections between silos and between an enterprise and its customers.

This sounds like an overweening statement, my very own Icarus moment, but it is not as outlandish as it might appear. Think of it this way. If we get down to the nuts and bolts of what an enterprise is trying to do, down to the "truthful representation of what you are up to", then you are suddenly working with details that everyone in an enterprise will recognize and even respect. This will be true for people from marketing, engineering, management, and even finance. This may sound a little far-fetched because people from these different business units (not to mention from external suppliers) will literally speak differently, work differently, and judge things differently. However, over the years I have been repeatedly surprised when I see people from absolutely opposite ends of the disciplinary universe actually drop their prejudices and say things like "you mean to say that if we did this we could..." or "if we did this, then you would be able to..." and in each case identifying how things would be easier for someone else and accordingly better for the end user and customer. When this does happen, we are usually looking at the details around a product or a service or an initiative, and we are doing so by looking at content assets that are clearly not quite ready to be released to a user or to the public.
And when this happens, part of the discussion inevitably turns towards how the content can be pulled into, displayed, and used differently by the stakeholders around these different business units. The fact that content can be channelled into all of the necessary representations, which each party can specify for their own needs, seems to provide everyone with some breathing room. People lower their guards and collaboration actually happens.

It has also been a source of incredible affirmation to see people in massive organizations, wracked as they are by the worst forms of bureaucratic barnaclization, suddenly come to life when they can see a path towards actually making things better for the customer and for the users of their products or services. I will never forget one project where a collection of school teachers surrendered part of their treasured summer vacation in order to assist in the conversion and validation of curricular material that was being salvaged from an ancient and pernicious format. It flabbergasted me then and it still flabbergasts me now. XML conversion and quality control is used as a punishment in some cultures. But their rationale was crystal clear - the vision of an interactive version of the educational curriculum that could be used by teachers, parents, and students was simply so compelling, and so close to why they had become teachers in the first place, that this was a sacrifice worth making.

As I have observed elsewhere, good content runs deep. It runs below the waves, below the turbulence that politics stirs up. It runs close to why people decided to do what they do.

So what do we do with this realization? If content practitioners have in their hands the materials, tools, and communication skills to build these connections, then where do we go from here? This brings us to the topic of Content Leadership. It is admittedly a somewhat unusual mash-up of words.
However, given the picture we have been painting of integrated content, you can see what "content leadership" will be about. It is about taking the initiative as professional communicators to reach out to people in different business silos and to start the often hard work of collaborating on shared content assets. There is no disguising the fact that this is hard work or that there will be some rough times. This is why, when we talk about leadership, we are invariably talking about taking risks, accepting responsibility, and even making sacrifices to help other people and to make things better. Leadership is a topic all of its own, and for good reason.

I have made plenty of acerbic statements about barnaclization as a debilitating process, and I am on record as declaring that business silos are not something that can ever be eliminated even if that were something advantageous to do (which it is not). So what I am saying is that through our efforts to awaken and enliven the potential of good content to build vital links between silos and stakeholders, we can turn these natural forces in a constructive direction. Rather than crippling our vessel through the uncoordinated growth of isolated business units, we can steer towards a coordinated growth that creates something new, something durable (although obviously not invincible), and something that fosters innovation instead of impeding it. The picture that emerges is one of a coral reef: an ecosystem that grows and sustains new life. This is a much more endearing picture than a boat that has been rendered immobile by an encrustation of barnacles.

Using an entirely different palette of analogies, I approach this topic in this short video. It is a recording of my keynote address to the Lavacon 2015 conference in New Orleans. The title of my talk was "The Dark Arts of Content Leadership". Not too long ago, in early 2015, I asked the question: Would the real Content Management please stand up?
Going back several years earlier, to 2009, I had posted a meditation on The Trials and Tribulations of Content Management. Between these two bookends, I have been on something of a quest, a quest that a good many people have joined by contributing comments and asking questions. To all these people I owe a heartfelt thanks, in no small part for their patience as I ventured this way and that trying to figure out why content is so special and why it is so important for us to help organizations to manage it well. This post sums up where my adventures have landed me and shares why I think I may have finally solved the riddle of content.

Recalling my post from early 2015 that set out to find the real Content Management, I will point out that I concluded this search by declaring that what we are really seeking is integrated content management. In a keynote speech at the Information Energy 2015 event in Utrecht, I spent time digging into Web Content Management, Learning Content Management, Enterprise Content Management, and Technical Content Management. I did so to find what it is that unites these disciplines and how they might be integrated so as to work together instead of at cross-purposes. This led to my initial use of the phrase integrated content management - the need to combine the different content management disciplines into a single and coherent whole.

Of course it did not take me long to move beyond this point and to dig more deeply into what integrated content management might mean. This led me to focus in on integrated content, and its management, as the real question. If we could establish what integrated content is, then we should be able to sort out what its management should look like and therefore how the various content management disciplines, and technology products, could be constructively coordinated.
And in turn it did not take long to see that what we were really doing was exploring the integrated, and integrative, nature of content itself. So over a number of years, I had basically come back to the conclusion of my 2009 post on the Trials and Tribulations of Content Management - to the call for a clear, shared, and well-grounded definition of what it is we are trying to manage, with that being content.

I am going to use snippets from some of my 2015 presentations to provide a survey of where my various inquiries have taken me when it comes to what it is we mean by the word content and why getting to this understanding was (at least to me) so important. Now it may be true that I am the only person on the planet who has been taking any comfort from these discoveries. But I do take solace in having arrived at them, and this means I can't help myself and just need to share them.

To begin at the beginning, let's establish the organizational context within which we must think about content. This context is the concept of the Enterprise. This actually takes me back to a presentation from 1999 called Managing Knowledge in the Fractal Enterprise, a fact that helps to illustrate how long I have been mining for answers to these questions. This short segment comes from a talk I did at TCWorld 2015 on Practical Steps towards Integrated Content Management (full presentation from Slideshare included below). In understanding Enterprise in this way, as a dynamic and purposive assembly of organizational resources and capabilities, we immediately get a sense for what content must be in order to live and thrive within this context. It must be portable and processable - something that can move between different systems throughout its lifespan.
As I state in this segment, one conclusion that emerges from this is that open standards are central, not peripheral, to the nature of content and therefore are fundamental to content management.

The next stopping point in this inquiry is to grapple with what we mean by the word information and therefore what we mean by information management (by the way, the primary focus of enterprise content management systems). The domain of information, and of mis-information, is a fascinating one. This is the domain where we can see, and begin to understand, the true nature of organizational silos. What we come to see are the swirling whirlwinds of information transactions that continually unfold within and around organizations. And what we notice is that many of these information transactions are intended to convey control details about what is going on, or about what organizations would like to see happen. We also notice that a great many of these transactions are intended to define and defend the boundaries of organizational silos, forming as they do disciplinary and operational echo chambers. In this, we see a picture of barnaclization, of individual innovations and contributions actually working against the long-term viability of the enterprise (see my 2008 post on the Barnaclization of Systems). And finally we notice how many of these transactions are really intended to be deceptive, to be mis-information, to be spin. It is an overwhelming picture. A bit like the head of Medusa: if we were to see the full sphere of organizational information transactions all at once, we would surely turn to stone. One reason we should start to think about content as something separate from the fog of information is that doing so gives us a chance to manage (using abstraction among other things) the information domain more effectively than we do currently.
This brings us to the climax of the inquiry into content - to the point where we establish a clear definition for the word content and for how it relates to our understanding of the word information. Interestingly, perhaps, this core definition of the word content is one I framed on the same day in 2009 (in a post entitled The Truth about Content) that I wrote my piece on The Trials and Tribulations of Content Management. Here is a more recent encapsulation.

In this most recent encapsulation, a few declarations stand out. One is that content strives to be truthful. Another, and one that flows from this first declaration, is that content is an asset and it is always an asset. These are important insights as they shine a light, indeed a very bright spotlight, on why content is so valuable and why it so merits effective management and use. So what is it that elevates content into being this uniquely special asset that stands as a truthful touchstone within an enterprise? The answer lies in its integrated, and integrative, nature. Content pulls together grounded details from across an enterprise and does so in a way that maintains their context and infuses them with a rhetorical momentum that aligns with what the enterprise is trying to achieve: its goals. In assembling the details, together with their contexts, content becomes a uniquely high-value asset that can be used to make durable connections between disciplinary and operational silos. Good content becomes the basis for building better connections with customers and users, and these connections actually reflect back into the organization in a way that reinforces the connections being made between silos.
At this point, it might be helpful to invoke the help of an example from history. What we see in the work of Luca Pacioli in the late 15th century, with the introduction of double-entry bookkeeping (a key milestone in the history of modern accounting), is relevant because what we really see here is triple-entry bookkeeping. The third entry refers to the explanatory text that each journal entry exhibited. I argue that this changes the nature of the accounting journals from being purely an assemblage of data items to being integrated content that actually explains, and places into context, those data items. The completeness of these entries, and their understandability by just about any stakeholder, is what is most interesting, and revolutionary, about these examples. These entries showcase the true nature and power of integrated content. They point us towards how integrated content can be used to fundamentally improve how enterprises work by grounding subsequent information transactions on a bedrock of truthful and meaningful details. So let's look at the idea of integrated content even more closely.

Once we come to see content in this way, as an asset that integrates a variety of inputs and that enables specific types of grounded communications between silos and between an enterprise and its external stakeholders, then we gain a better understanding of how content evolves within a content life-cycle model. We come to see what it is that we want from the content technologies that we will need to put into place and how those technologies will need to handle content as content - as complex composite artifacts that must be respected as such.
We come to see how our work with content assets must maintain its responsive orientation towards how the information products they engender are received - in how well they work for the users trying to do something.

Although this post is already too long, it has been scrambling through a lot of material, and a lot of ideas, with the consequence that I have probably failed in communicating why I think seeing content in this way is so important. Essentially, I contend that if we see content as a complex composite artifact that brings together grounded details from across the enterprise, and that can be used to produce information products that external stakeholders really need, then we understand why content is such a unique, special, and valuable asset. We can see why it merits the attention of a specialized community of practitioners and calls for a vibrant marketplace of content technology providers. In effect, this contention is important to me, personally, because it helps to justify, somewhat after the fact, the time, effort, and money I have invested in this field. If I am totally honest, I will leverage this contention to at least explain the major sacrifices I have made over the years and for which I have yet to fully atone. All this to say, content is more important than any of us fully appreciate.

So this brings us back to the topic of Integrated Content Management. Once we have established what content is, and brought to the fore its integrated and integrative nature, then we can see how the different content management disciplines and tools can themselves be integrated. If we put in place the infrastructure to genuinely manage our content assets as content, we can then produce the types of enriched information resources that the more mainstream content management systems (web, learning, and enterprise) can leverage to address the very real business needs that they are each designed to handle.
And when all of these wheels are working together, the real winners are the enterprise and its community of stakeholders: its staff, suppliers, customers, product users, and yes, even shareholders.

Practical Steps Towards Integrated Content Management (Nov 2015) from Joe Gollner

Believe it or not, there was a time when we did not talk about content. At least not in the way we do today. To some ears this will sound decidedly odd. To others it might even sound outrageous. But it is neither. I would like to suggest that the concept of content that we now associate with management and publishing has been shifting under our feet, and that these changes should help us to define the term more precisely and to wield it more effectively.

We can start by turning the clock back a few decades and considering how we once used the word content. Whether we were confronted with bewildering stacks of military manuals or unending shelves of historical texts, we would talk about digging into the contents of this or that publication. Even while we fussed over the pagination or the method of printing that had been used, we were certain that the real value lay elsewhere – somewhere within the content being expressed. In these cases, we thought of content as the meaning being communicated by the publication – its intrinsic value. More recently, in an organization buried under mountains of legal dossiers, a colleague of mine declared that what we were trying to do was break the documents open and unlock the content that lay within.

This use of the word content is completely in line with its historical sense, coming as it does from the Latin term contentum, or that which is contained. Content is what we seek to extract from within document containers, and then to apply in a given situation or to reuse in another context. We are in fact highly adept at finding the content within published documents.
With remarkable facility, we scan documents, interpret the supplemental meaning provided by the layout, and identify what content is relevant to our needs.

I would like to argue that we need to define the term content more precisely. Specifically, we need to start using the term content in a way that considers content separately from its many potential delivery containers – that thoughtfully abstracts content away from the organization and format of any one publication. I would also argue that it is only by enforcing this separation that we can design and deploy content management and publishing solutions that are genuinely effective, scalable, and sustainable.

Now let's move the clock forward to the very early days of the web. One thing that stands out about the creation of early web sites is how it forced us to see content in a new light. Once again we were digging into these documents in order to grapple with their contents. But this time it was different. This time we were not only trying to interpret what the documents meant. This time we were trying to extract that meaning so we could make it accessible in a new medium. On the most basic level, we became very conscious of lots of nitty-gritty minutiae like character encoding, so that it was possible to display the files correctly on computers over which we had no control. On a higher level, we immediately set to work on finding ways to present the content in compelling and effective ways through the relatively new phenomenon of the web browser. It was an exciting time because we had a chance to explore a new publishing medium and what promised to be a new way of doing business.

During the time we were working on these early web sites, we continued to have a community of users, and indeed a very large community of users, for whom the printed output was the most important deliverable.
So it became necessary to maintain legacy publishing processes even while we were bringing new web delivery channels into service.

We were not entirely happy about this circumstance. After many years (decades actually, and sometimes even longer) of progressive refinement, the organization and format, or as we will call it the layout, of printed publications had become subject to layer upon layer of onerous control. It was not lost on us at the time that one of the things that made the web so attractive was the chance to break away from the stifling regime of print documents that had come to prevail in many organizations. We were at the head of the line when it came to championing the new freedom that the web seemed to offer. The old paper regime had breathed its last and, by bringing web servers online, usually by side-stepping obstructions erected by old-school information technology groups, we would usher in a brave new world of promiscuous collaboration and unbridled innovation. To lean on a gratuitous reference to The Lord of the Rings: we were, all of us, deceived. The power of the paper regime, and the empire of documents, could not be so easily undone.

It did not take long for us to discover why many of those print publishing rules had existed in the first place. In all too many circumstances, these legacy layout rules had evolved over years of experience and with the intent of serving very real needs. Often the layout rules were designed to help save precious time for people accessing or handling the publications. In some cases, the rules had evolved in order to ensure user safety. In still others, the rules had become subject to exacting legal requirements, whether through judicial precedent or through legislation. While it is true that these publishing rules had become overgrown and top-heavy, there were an infuriatingly large number of them that simply could not be discarded. And this was not an isolated occurrence, limited to just one industry sector.
This was as true for academic reference texts as it was for military manuals.

So the realization sank in that, no matter what else we set out to achieve, we would need to maintain a print publishing capability that would sustain the types of print deliverables that legacy business processes demanded. It also became clear that the same pressures that had produced the print publishing rules would come to bear upon our web publishing efforts. And this is exactly how things turned out. What was most surprising, now that we can look back, was the speed and ferocity with which organizational obligations began to collect around web publishing, and how quickly we responded with our own bevy of guidelines and publishing rules for the web.

We knew what we needed to do but, of course, we immediately tried to find some shortcuts. Perhaps there was a way to migrate content directly from the print-oriented representation into a web-ready form. Perhaps doing so would allow us to import, holus-bolus, layout conventions from the print world and thereby get around needing to define and validate new publishing rules for the web that would meet the same business objectives. We knew better, but we had to try anyway. There was also a lot of pressure to maintain the print update cycles even while we brought new channels online. And naturally there was a desire to see that anything delivered via the web would be synchronized with what had been released in paper. All this meant that time was of the essence, and a shortcut for streaming content from print onto the web, if it worked, would have been the answer to many prayers. These experiments turned out pretty much as expected, in that they failed miserably. So we returned to what we knew was the necessary answer. We were forced to turn our attention to true multi-channel publishing from a single, authoritative source of content.
We were forced to look at the content as content.

In the latter part of the 1980s, there were a number of industry sectors (including the military as well as academic and legal publishing) where the march toward multi-channel publishing had begun a few years earlier with the adoption of the Standard Generalized Markup Language (SGML). In these sectors, publishers had been forced by the escalating complexity of their business to radically change how they prepared, managed, and published their content. In a predictable number of cases, a key driver for the adoption of SGML was enabling the distribution of publications through multiple channels, with an emphasis on electronic delivery. This move towards SGML was also driven by a need to establish searchable stores of content as a more practical way to handle swelling volumes and long-term access requirements. The people working in this field at the time did have a sense that, in their various experiments, they were exploring the future of publishing.

The introduction of SGML into publishing processes, it turned out, necessitated a lot of changes in how we viewed and handled publications. The most profound of these changes was the fact that applying SGML, if done correctly, would force us to thoughtfully abstract the content away from the delivery formats that were appropriate to any one publishing channel. As an illustration of how difficult making this change was, a disturbing number of the early SGML applications were in fact typesetting specifications re-expressed using angle brackets. Even in these cases, however, there was a growing appreciation for the fact that the way content is prepared and maintained need not be identical to how it is delivered for use, and that automation could be deployed to facilitate high-quality publishing in order to transform content assets into useful information products.
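That abstraction is easier to see in a miniature sketch. Assuming an invented semantic tag set (and using XML, SGML's direct descendant, with Python standing in for the publishing automation), a single channel-neutral content asset can be transformed into two different information products:

```python
import xml.etree.ElementTree as ET

# A channel-neutral content asset: the markup names meaning, not formatting.
# The tag names here are invented for illustration.
SOURCE = ("<topic><title>Fuel Pump Removal</title>"
          "<para>Drain the tank before removal.</para></topic>")

def to_html(topic):
    """Web channel: the same asset rendered with HTML formatting."""
    return (f"<h1>{topic.findtext('title')}</h1>"
            f"<p>{topic.findtext('para')}</p>")

def to_text(topic):
    """Print/plain-text channel: the same asset with a different layout."""
    title = topic.findtext("title")
    return f"{title}\n{'=' * len(title)}\n{topic.findtext('para')}"

topic = ET.fromstring(SOURCE)
print(to_html(topic))  # one content asset...
print(to_text(topic))  # ...two information products
```

The point of the sketch is that neither rendering function changes the asset itself; a third channel can be added later without touching the source, which is exactly the scalability that format-oriented markup forfeits.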
Fortunately there were also SGML projects that really did pursue the new goal of representing the content as an asset that existed separately from its containers, and that maintained an arms-length relationship to the organization and formatting associated with various publications. Invariably, these projects also discovered that content assets needed to be managed in a modular way so as to facilitate reuse and referencing across collections that would quickly escalate in size, volatility, and complexity. I would argue that it is within this latter type of SGML project that we first got a glimpse of the true nature of content.

It is worth recalling that an early SGML-based publishing environment, and in fact the system that was used to publish the SGML standard itself in 1986, operated at the high-energy physics laboratory at CERN where Tim Berners-Lee was, in parallel, hatching the web. This is why the Hypertext Markup Language (HTML) was framed as an SGML application. This is also why HTML exhibited the common compromise seen in early SGML projects, in that it was largely a format-oriented application of SGML. Far from being a problem, the ability for just about anyone to create a web page using simple formatting markup turned out to be one of the key reasons the web took off as it did. It seems a little odd to say, but part of the web's success stems from the fact that it ignored the content and provided instead a new delivery channel for any content that could be poured into a web page and given basic formatting tags. But the fact that HTML draws its roots from SGML also points to a latent capability that lies buried within the tissue of the web – the capability to handle content separately from its presentation layout and thereby to enable far more scalable publishing models for businesses and more responsive web experiences for users.

Through this series of events, one very important thing was becoming clear.
For a number of interrelated reasons, we were being forced to think about content differently than we had in the past. With the emergence of the web in particular, we started to think about content as the conceptual material that we would plan, prepare, and manage in a way that was different from any one of the forms in which it would be delivered to users. This content, if done correctly according to this line of reasoning, would allow us to efficiently produce all of the publication types that would be needed – including those of which we were as yet unaware. This lesson, if it needed reinforcing, has been driven home by the new demands being introduced by the mobile revolution that is unfolding around us.

Somewhere along the line in this journey from print publications to mobile devices, I began to define content in a somewhat unusual way. I came to use the term content to mean "potential information" and, apart from being wilfully idiosyncratic, this definition allowed me to make a sharp, but essential, distinction between the reusable content that we would want to manage as a long-term asset and the many transactional forms that the information might take as it is printed, served to a web browser, or delivered to a mobile device. Information, under this rubric, is understood as an action, and one that should be judged on whether or not it is effective. Information transactions become the venue where the potential value of content assets is realized.

Looking more deeply into the content, we started to build on our appreciation for its true nature. By studying the publishing rules associated with legacy publications, and coming to grips with the business objectives and obligations tied to those rules, we came to understand that in order to fully abstract content away from the organization and formatting of publications, we needed to locate, understand, and represent the supplemental meaning that was being expressed in those publishing rules.
If we were going to establish appropriate layout behaviour in radically different channels, then it was not just the text and media assets that we needed to manage. We needed to manage the relationships between the text and media assets, relationships that reflected and respected the logic governing how content assets would interact during the delivery of effective information experiences. Essentially, everything that will be necessary to facilitate effective information events would need to be managed as the content. This is what "potential information" really means. And this is why there is a unique and inescapable role for content technologies - for tools that handle content as content, and that can transform content assets into information products.

Warnings in technical manuals provide a useful illustration. Anyone who has had to open a manual in order to fix something will recall seeing "warnings" that caution people to avoid doing certain things. Within the military, our favorite off-color example was a warning that read "Do not look into the laser with remaining good eye!" As this example showcases, the placement and timing of a warning is important. There is a logical connection between a particular warning and a set of steps in a procedure. This connection must be made explicit because it is rather obvious that the warning needs to be made prominently visible (and even audible) before someone starts a task. It is also important that the warning remain visible until the relevant steps are completed and the danger has passed. The presentational design criteria that apply will be different for a printed loose-leaf technical manual than for a portable maintenance application that runs on a tablet.
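As a sketch of what making that connection explicit might look like (the markup vocabulary and rendering logic here are invented for illustration, not drawn from any real manual standard), the warning declares which steps it governs, and each channel then decides how to honour that relationship:

```python
import xml.etree.ElementTree as ET

# Invented channel-neutral markup: the warning is linked to the steps it
# governs by identifier, instead of being hard-wired to a printed position.
SOURCE = """
<procedure>
  <warning id="w1" applies-to="s1 s2">Do not look into the laser
  with remaining good eye!</warning>
  <step id="s1">Align the laser emitter.</step>
  <step id="s2">Verify the beam alignment.</step>
  <step id="s3">Replace the access panel.</step>
</procedure>
"""

def render_print(proc):
    """Print channel: emit each warning once, before the first step it governs."""
    first_governed = {w.get("applies-to").split()[0]: w
                      for w in proc.findall("warning")}
    lines = []
    for step in proc.findall("step"):
        warning = first_governed.get(step.get("id"))
        if warning is not None:
            lines.append("WARNING: " + " ".join(warning.text.split()))
        lines.append(step.text)
    return lines

proc = ET.fromstring(SOURCE)
print("\n".join(render_print(proc)))
```

A tablet application could use the same applies-to relationship quite differently, for example keeping the warning on screen until the last governed step is confirmed complete. The logic travels with the content; each channel supplies only its presentation.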
But the goal of safety, and the logical connections between a warning and a set of tasks, will remain constant. The logic behind publishing rules, such as those governing these warnings, is in fact an intrinsic part of the content, and this logic must be incorporated into any representation of the content that hopes to credibly reproduce effective publications in all of the channels being addressed. Hopefully the example of safety warnings helps to illustrate why thoughtfully abstracting content away from the publication layout is not as simple as it might initially appear. And hopefully this example helps to explain why thoughtfully abstracting content away from any one publication is essential if we wish to achieve our strategic business goals by effectively addressing emergent publishing channels.

Although the definition of content as potential information, and this sharp distinction between content assets and information products, has earned me more than a few rolling eyes over the years, it has proven to be far more indispensable than I would have ever imagined. With this distinction in hand, it becomes possible, even natural, to think about content in two separate, but obviously interrelated, ways. One way asks how best to design and manage the content assets so as to be ready to support many different publishing channels. The second way directs attention to the processes whereby content assets will be assembled into, and published as, information products that achieve both organizational goals and individual needs.

In fact, over the last 25 years, it has been my experience that truly successful content management and publishing solutions have only been possible when we have rigorously applied this distinction between content assets (as potential information) and information products (as contextualized information events).
And whenever affordability, sustainability, and adaptability have been given any attention at all, this approach is the only one that consistently delivers the desired results. By staging content assets in a way that is thoughtfully independent of all publishing channels and that makes explicit the logical connections between content assets, we have been able to optimize how we acquire and manage that content while simultaneously optimizing how that content is published across channels and in response to always-changing user needs. We have literally been able to have our cake and eat it too.

Epilogue

At the Best Practices conference of the Center for Information Development Management (CIDM), in Saint Petersburg, Florida, I gave a short TED Talk-style presentation on this subject. The short talk did not dig into all of the details addressed in this post, although it did apply more attention to some themes. Specifically, the presentation did a better job of approaching, and illuminating, how escalating complexity drives the move towards content modularity, and how content technologies have emerged as a distinct technology discipline that really does set out to handle content as content. The Birth of Content (JGollner CIDM Best Practices 2015) from Joe Gollner

Preface

This post attempts to define intelligent content in a new and hopefully fresh way. While still compatible with previous efforts to define intelligent content, and to describe its utility, this attempt consciously adopts new language in the hope that doing so will provide practitioners with some novel tactics for explaining the nature, purpose, and value of intelligent content.

This post emerged in response to, and in conjunction with, discussions that occurred in early 2015 between Ann Rockley, Scott Abel, Charles Cooper, and Joe Gollner on the topic of how intelligent content might be repositioned so as to resonate with a wider audience.
The imprint of those exchanges can be seen throughout this post. For another perspective on the subject, see the whitepaper "The Emergence of Intelligent Content: The Evolution of Open Content Standards and their Significance" (2009).

Defining Intelligent Content

Let's jump right in and see what a new definition of intelligent content might look like:

Intelligent content is digital, data-driven, and dynamic.
- Digital in being designed and built for a connected world.
- Data-driven in being meaningful to both people and machines.
- Dynamic in being responsive to different user needs.

Today, organizations create, share, and publish information in many ways. At any one time, these organizations will be engaging their audiences with email campaigns, social media interactions, digital catalogues, online information sessions, advertisements of various forms, educational and promotional videos, user support information, and yes, even good old-fashioned printed manuals, collateral, and books.

In order to do all this, these organizations need to prepare their information content differently than they have in the past. It is no longer practical to work in channel-specific content creation tools and then spend a lot of time manually reformatting and tailoring that content to fit other channels. Today, they need to prepare their content in a way that will let them do everything they need to do with that content, and to do it as quickly and efficiently as possible. Today, they need intelligent content.

When we use the word intelligent to describe this type of content, we are keying on the definition of intelligent as "the ability to acquire and apply knowledge" (Oxford English Dictionary). Intelligent content, then, refers to how organizations articulate, share, and leverage what they know about their business, products, and customers.
It’s how they create, manage, and publish this knowledge when they don’t know beforehand what format, or sequence, the knowledge will be most usefully communicated in. So let’s take a closer look at what we mean by each of the terms we are using to define intelligent content: digital, data-driven, dynamic. [You can also take this post to go on Slideshare.]

Digital Content

Working with computers, of course, is not particularly new: the desktop publishing revolution took place way back in the 1980s. Communicators quickly became adept at using various computer programs to prepare and lay out content. Production processes also moved more and more towards being digital, and today non-digital processes are a rarity. But all of this is only the first step into the digital domain.

Ever since the World Wide Web appeared in the early 1990s, there has been a growing awareness that information content needs to be distributed online as well as in print. And in recent years, the explosion in social media channels and in mobile devices has flipped the publishing business on its head. Today, content must first be prepared for a mobile user – and this mobile user must be equipped with the ability to interact with, and to personalize, the content itself. This leads inevitably to a “digital first” mode of thinking about content assets.

In being digital, intelligent content is optimized for automation. Automation is used to locate content, filter it, sequence it, and format it to suit the highly specific needs of a given user, in a given location, at a given time, and with a specific objective.

Think of a buyer for a typical business who has been given a deadline to find and purchase a specific type of device. The buyer is away from the office and needs to do this research and make a selection using a smartphone. The product supplier whose content is easily found and viewed on this smartphone will be the one that gets a closer look.
And the supplier whose product catalogue provides content that is clear, complete, and useful – and which answers the key questions the buyer might have – will be the one that wins the business.

As we can see from this brief example, the information that a product supplier provides needs to fit into an online, mobile, and connected marketplace. It is also important to highlight that there is probably more to this story: perhaps the buyer needed input from an engineering team in order to finalize the purchase decision. The buyer locates a technical specification for the device, available online as a carefully laid-out Portable Document Format (PDF) file, filled with tables and illustrations and supported by a three-dimensional model. If the buyer can send this PDF and model to the engineering team along with a link to the user documentation that is available online, then this part of the buying cycle can be kept short and sweet. If these stakeholders like what they see, then the green light will be given and the first purchase will be made. We can only hope that the product itself lives up to the expectation of quality that this supplier’s content has established.

So this is what we mean when we say that intelligent content must be digital. Throughout its life, intelligent content must be handled in a way that facilitates its publication, maintenance, discovery, and use by leveraging automated processes. This automation will further make it possible for teams of content experts to collaborate on the design, creation, and publication of the content – and to ensure that it is continually synchronized with the products and services of which it is a part. This points us to the next dimension of intelligent content: the fact that intelligent content must be data-driven.

Data-Driven Content

In an earlier time, content was handled as something separate, self-contained, and isolated. All eyes were on how the content looked when published.
Think of a team of technical communicators preparing user documentation, for example, with proprietary layout tools into which they copy and paste essential product details. Through this painstaking and time-consuming process, product part numbers become table entries and feature descriptions become list items. But no matter how good the resulting publication looks, you can just tell that there is going to be trouble when the time comes to update some of the product details. The copying, pasting, and formatting will need to be done again – and hopefully done correctly. For more complex products, ones associated with numerous replacement parts and troubleshooting steps, this update exercise becomes both expensive and frustrating. And this exercise is not limited to the technical documentation team. Down the hall in marketing, the very same thing is happening. Even worse, these two groups share each other’s content, again using the time-honoured practice of copy and paste. Before long we find ourselves in a situation where the marketing materials no longer jibe with the user documentation, and neither lines up with the details provided in the product catalogue or with the product itself.

Clearly this makes no sense at all. Organizations need their content assets to be data-driven. They need their content to be intelligent. This means that the content will incorporate the data resources that an organization maintains about its products or services, and will do so in a way that maintains an active connection with the master sources of that data. If the content includes part numbers, then it needs to include the actual part numbers in a way that can be automatically kept in sync with the product design. If certain procedures in the user documentation are helpful as illustrations in a piece of marketing collateral or in a training module, then they should be reused in a way that can be managed and updated.
This is a large part of what we mean when we say that intelligent content must be data-driven. It is driven by updates from live data sources.

Of course, there is more to intelligent content being data-driven than this, important as it is. In being data-driven, intelligent content comes to life as a data source in a way that was never possible with those large and impenetrable document files that we all remember so fondly. Intelligent content is consciously designed to showcase its structure and the meaning of its components in such a way that both people and machines can make sense of it and do something with it. So rather than a proprietary file that can only be read by the desktop publishing software that created it, and then only insofar as is needed to print out pages, intelligent content can be read by anyone and can be prepared for use by any piece of software. If it isn’t obvious enough already, we will draw a line under this and stress that this is radically different. It is also a major advance over what we used to do.

So you will notice that we tend to talk about intelligent content in ways that are very similar to the way information technology professionals talk about databases. We define the structure that the content will exhibit, the relationships that will be observed, and the metadata that will be applied to different components. To recall an earlier example, it may be important that part numbers are captured as part numbers, and that feature descriptions remain identifiable as such instead of becoming simple list items. This way, specific details from the master product database can be pushed into the documentation at the right places, and a technical communicator can be prompted to write feature descriptions and troubleshooting procedures following the applicable style guidelines.
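To make the part-number idea concrete, here is a minimal Python sketch of a build step that resolves semantic part-number references against a master data source. All of the names, the placeholder syntax, and the data are illustrative assumptions, not a prescribed design:

```python
import re

# Hypothetical sketch: content components reference parts by ID
# rather than embedding pasted literal strings, so the master
# parts database remains the single source of truth.
MASTER_PARTS = {  # stands in for the live product-data source
    "P-1001": {"name": "Drive belt", "number": "DB-2240-A"},
    "P-1002": {"name": "Tension pulley", "number": "TP-0031-C"},
}

# A content component with semantic placeholders, not pasted values
component = "Replace the {part:P-1001} (part no. {partnum:P-1001}) every 500 hours."

def resolve(text: str, parts: dict) -> str:
    """Substitute part references with current master data."""
    def repl(match):
        kind, pid = match.group(1), match.group(2)
        part = parts[pid]
        return part["name"] if kind == "part" else part["number"]
    return re.sub(r"\{(part|partnum):([A-Z0-9-]+)\}", repl, text)

print(resolve(component, MASTER_PARTS))
# → Replace the Drive belt (part no. DB-2240-A) every 500 hours.
```

When engineering renames or renumbers a part, only the master data changes; every rendition picks up the update on the next build, which is the "active connection" described above.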
Looking downstream, validation software can check these details against the applicable rules, and the formatting software can render them in the right way for each channel. And the mobile application used by field technicians can leverage the part numbers to issue an order to the inventory system, and the diagnostic wizard can walk the user through the right troubleshooting steps for a given situation. When both people and machines can understand your content, magic happens – and this is no exaggeration.

If that were not enough, there is even more. In being data-driven, intelligent content becomes a resource that can be tailored very precisely to fit whatever is known about a given user. When the data showcased within the content is compared to the data available about a user, such as location, device, and even activity, we get a result that lines up nicely with what the user will find valuable. Think back to the beautifully laid-out publication or product catalogue that the technical communicators had produced, and we notice just how important the layout became for people trying to find information. Users with questions needed to find what they were looking for by traversing the way the content had been laid out. What content was relevant to the question would be found essentially by reading the text. First, the user would start with the Table of Contents, or perhaps consult the index, and then, based on that guidance, the user would scan a specific section of the document for the right information. As you can tell, this is essentially a manual process. And it’s a manual process that is unbearably tedious when attempted on a smartphone, especially when you are in a hurry.

In the age of big data, when so much is known about people and their activities, one of the better outcomes is that organizations can provide information that is individually prepared to fit people’s situations, so that they are in and out as quickly as possible.
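The comparison of content metadata against user data can be sketched in a few lines of Python. The field names and the simple match-counting score are illustrative assumptions, not a prescribed algorithm:

```python
# Hypothetical sketch: each content component carries metadata;
# a user's known context is scored against it so that the most
# relevant component is delivered first, instead of making the
# user traverse a layout by hand.
components = [
    {"id": "c1", "topic": "installation", "device": "desktop", "region": "EU"},
    {"id": "c2", "topic": "troubleshooting", "device": "mobile", "region": "EU"},
    {"id": "c3", "topic": "troubleshooting", "device": "mobile", "region": "NA"},
]

def relevance(component: dict, user: dict) -> int:
    """Count how many metadata fields line up with what we know about the user."""
    return sum(1 for key, value in user.items() if component.get(key) == value)

user_context = {"topic": "troubleshooting", "device": "mobile", "region": "NA"}

best = max(components, key=lambda c: relevance(c, user_context))
print(best["id"])  # → c3, the component matching topic, device, and region
```

A real delivery system would use richer taxonomies and weighting, but the principle is the same: because the content showcases its own metadata, the selection can be automated rather than left to the reader.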
With all this data at our disposal, we will be increasingly able to answer people’s questions before they ask them. So, again, it’s time to say goodbye to imagining that users want to click, scroll, and scan in the vain hope of finding the one detail that is relevant to them. With intelligent content, we can do so much better.

Dynamic Content

Content that is both digital and data-driven is poised, then, to be highly dynamic. This means that the content can be adapted quickly and efficiently to exactly suit the needs of different users. It is fundamentally responsive, which is much more than simply adapting to different viewing dimensions. Intelligent content that is genuinely dynamic can be programmatically adapted to reflect specific product versions, to incorporate customer-specific details, and to take into account a user’s location and even background. It can be adapted to work optimally in different formats, themselves produced automatically. For electronic delivery channels, it can parcel out the details in a progressive disclosure experience – where information details are provided as they are requested or as a situation demands them, instead of simply being dumped on the user at the outset.

A high-quality print product can, with intelligent content, continue to be offered. And these print publications can be much better than their predecessors. For one, they can be made very specific to what applies to an individual prospect or customer. No longer do readers need to guess which product version number applies to them, as most of us need to do when we open our car owner’s manual. This is a good thing, but we all know that our primary interest has moved well beyond print. Electronic documentation that knows what version and configuration of the product you have is going to be infinitely more useful than printed manuals.
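That version-aware selection can be sketched minimally in Python. The variant structure and data here are hypothetical, invented for illustration:

```python
# Hypothetical sketch: one content source holds variants marked with
# the product versions they apply to; the delivery layer picks the
# variant matching the reader's actual product, so nobody has to
# guess which instructions apply to them.
procedure_variants = [
    {"applies_to": ("1.0", "1.1"), "text": "Hold the reset button for 5 seconds."},
    {"applies_to": ("2.0",),       "text": "Choose Settings > Reset from the touchscreen."},
]

def select_variant(variants, product_version: str) -> str:
    """Return the instruction text applicable to the given product version."""
    for variant in variants:
        if product_version in variant["applies_to"]:
            return variant["text"]
    raise LookupError(f"No content variant covers version {product_version}")

print(select_variant(procedure_variants, "2.0"))
# → Choose Settings > Reset from the touchscreen.
```

The same mechanism supports progressive disclosure: because the variants are discrete, addressable components, the application can hand out only the piece the situation demands.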
Today, car manufacturers provide technicians, and even users, with augmented-reality-enabled tablets that literally show them what to do when there is a problem. Powered by intelligent content that is digital, data-driven, and dynamic, these tablets will tap into the experiences of thousands, if not millions, of other users in order to recommend the action that is statistically most likely to solve the problem. And as technicians work through the problems, the tablet-based application dynamically interacts with the online catalogue of spare parts to order replacements, and automatically updates the vehicle maintenance records as well as the auto-shop’s work management system.

Now we can see how intelligent content brings together the three key attributes of digital, data-driven, and dynamic in order to provide users with a fundamentally superior experience. This improved experience stretches across the complete customer experience lifecycle, from pre-sales encounters through to customer retention efforts. Taken to its fullest realization, intelligent content comes to play a critical role in that complex, dynamic system that is the integrated product lifecycle. Intelligent content captures and incorporates the feedback of customers distilled from their experiences as users of the product. It thereby engenders a learning process that supplies the grist to the mill of product innovation. So it is that intelligent content not only improves with time (as opposed to degrading, which is a distinct feature of past approaches), but it leads directly to better products and therefore better organizational performance. It is in this way that it becomes possible to talk about content, when it is intelligent, as a strategic asset.

We can also see that organizations benefit in other ways as well. The discipline and automation that come with intelligent content deliver efficiencies and improvements across the board.
These organizations see their update cycle times reduced to an absolute minimum, the consistency of their branding messages maximized, and their content positioned to rapidly reach new audiences in new ways. There are savings to be sure, for example by containing localization costs, but the bigger story focuses on what now becomes possible. In the battle to find, convert, and satisfy customers, intelligent content is one weapon organizations can no longer do without.

Epilogue

In general, there is merit in exploring the words that we use and experimenting with how they might be defined differently. These exploits with intelligent content are primarily of value in that they help us to build up, and to share, a better understanding of the root concept involved, which is content. This reminds us that it’s the Content, .... This blog post can also be viewed on, or retrieved from, Slideshare:

One thing that you will often hear content strategists talking about is the need to break down organizational silos. Only then will content flow from where it is created to where it is needed. It would be more than a little ironic, then, to discover that the world of content strategy is itself respectably well-outfitted with its own silos. Rubbing salt into the wound, we need to confess that these silos are almost perfectly cut off from each other. The community of content strategists in one silo will be largely unaware that there are other communities of content strategists working away in parallel silos. When practitioners from one silo accidentally encounter professionals from another, more often than not, they cannot understand what these strangers are saying, and sometimes they cannot even recognize their counterparts as fellow travelers. What we see here are silos of disciplinary bias in their most perfect form.

This should not come as much of a surprise, really. Business silos are a ubiquitous phenomenon and they are not something we can simply wish away.
You could be forgiven for thinking otherwise if you were to peruse any literature on the topic. Advancement usually follows a five-to-seven-step program of innovation, with the second or third step invariably being something like “erase barriers to collaboration” or “knock down the silo walls”. The only thing, apparently, that has blocked progress in the past was the absence of modern enlightenment and youthful innovation (a perspective we can call the TED Talk fallacy).

In truth, there are many reasons why business silos form, and so quickly re-form after this or that bold reform has swept through. Perhaps the most important among these reasons is something we can call “disciplinary bias”, which refers to the fact that practitioners will build up a way of looking at the world that flows directly from what they deem relevant in order to do the things that they do. The truth is that the stronger practitioners are at pursuing their chosen specialization, the stronger will be their disciplinary bias. So it is not something that organizations can simply do without, let alone attempt to remove. The question then becomes how to integrate and coordinate across these disciplinary perspectives. And this is why, quite rightly, content and business strategists return time and again to the challenge of tackling business silos.

Out of a morbid curiosity, we can take a closer look at the business silos that have formed within the general field of “content management” – zeroing in on the silos that currently corral working content strategists into separate tribes. A scan of the content management landscape will scare up enough examples for our purposes. We quickly spot enterprise content management, web content management, learning content management, and technical content management. Each of these will also have internal sub-specializations.
Within technical content management (which we might recognize as technical documentation management or technical information management) we will find practitioners focused on end-user help documentation who will have a different set of priorities than those who focus on the engineering information of interest to developers and integrators. Practitioners in any one of these groups will be familiar with different management tools and techniques, and will be driven by different measurements of success.

It is interesting, for me at least, to look at what it is – exactly – that is being managed in each of these silos. This in turn tells us, rather clearly, why practitioners in one group haven’t historically spent much time or energy looking into what other groups are doing. For example, those in an organization responsible for rolling out an enterprise content management system will typically hail from the information technology and management group. Their focus, if we are being honest, is not on managing “content” per se but rather on managing the information transactions, and records thereof, that essentially constitute a representation of the enterprise itself from an information perspective. In a sense, then, we can say that an enterprise content management system is really managing the communication channels within an enterprise as opposed to managing the content that engenders the individual information transactions that unfold on a daily basis. An ECM, therefore, is not really about managing content at all. No surprise, then, that the people working in enterprise content management will glaze over in mystification if they encounter someone who wants to really talk about managing content itself.

If we look at web content management, we find something similar.
These systems don’t in fact do a particularly good job of managing the content within information transactions, whether these events are web page renditions, messages provided to an interface, recommended associations, or even, heaven forbid, pop-ups. What is really being managed within a web content management system (WCMS) is the user experience, and this actually makes complete sense because this is precisely where the investment in content assets will achieve its objective, or not.

As a small aside, we can turn back the clock to 1995, when the web was young. At this time, a few nascent web content management systems appeared. I happened to be involved with the design and development of one of them (yes, I am that old, and yes, I am inclined to just that type of delusion). The web content management system in this case was called SpiderDocs, and it was produced by a company called InContext (RIP) that specialized in technical content management using the now unfamiliar Standard Generalized Markup Language (SGML). Like others in that part of the marketplace, InContext was trying to find a way to be relevant in the new world of the web. SpiderDocs was different (then and now) in that it sought to be a WCMS that focused on managing, of all things, the content itself. The premise was that if we could manage the content, we could generate any number of web pages, and even any number of web sites, that were called for. In this regard, it was years ahead of its time.

SpiderDocs was also spectacularly unpopular amongst web designers and developers. Their focus lay elsewhere. Their focus was drawn to the user experience, to how it could be made as impactful as possible, and then to how to measure that impact.
Not surprisingly, other WCMS products multiplied to fill this niche, so that today there are hundreds of competing products, and many very good ones, seeking to equip web content professionals with state-of-the-art tools for designing, deploying, measuring, and improving user experiences. What we can take away from this story is that web content management succeeds or fails in how well it manages the user experience. Web content management is not really about the “content” at all, and this explains why web content management systems provide such weak services for managing content assets or facilitating complex content processes. This also explains why web content strategists recoil in horror and disbelief when they encounter people who, working in a different business silo, talk about content assets being managed as content and being weighed in terabytes or petabytes. The most common response I get from web content strategists when discussing content in this way, and on this scale, is “What’s that for?” (also heard as “WTF?”).

We could go on. Similar stories can be told regarding learning content management or some sub-specialization of technical content management. And these stories will all illustrate the same point – that practitioners working in one specialized silo within the content business will not share much in common with colleagues working in other silos. Or so it perhaps seems. In truth, they do have a lot in common, and they do have a lot to offer each other. In spite of their terminological efforts to the contrary, all these practitioners do have an interest in content being managed as content.
And the evolution of many organizations, and their drive to improve efficiency and effectiveness through integration, are forcing content practitioners in different business silos to bump into each other more frequently and even, believe it or not, to collaborate.

As evidence that a trend in this direction is under way, we can point to the growing interest being shown in what has sometimes been called, rather unfortunately, “component content management”. I say unfortunately because if you define and consider “content” properly, then adding “component” is redundant and potentially misleading. Practitioners of “component content management” have historically been found buried within technical content management groups, or within what Sarah O’Keefe has delightfully characterized as the “TechComm Ghetto”. Solutions and techniques fashioned for this specialization emphasize open standards like the Darwin Information Typing Architecture (DITA) and the deployment of automated validation, assembly, and rendition processes. People working in other silos in the content landscape, such as in web content management, are finding that they have requirements that seem to call for many of the capabilities that have been hammered out over in this unfamiliar domain.

That people are starting to reach out across the business silos, or are being forced to do so, is a very good thing. And this circumstance has encouraged some to return to the terms and definitions that are being used to describe content and the associated management practices, so as to find new formulations that will appeal to practitioners across all the silos. Clearly “component content management” was not helping as a concept (uncharitably, I have said that this phrase was coined by a niche vendor community who are determined, oddly, to remain a niche). One such effort can be found with “intelligent content,” a phrase that has received a fair amount of attention over the last few years.
As some will know, I have jumped in and tried to contribute to this particular effort, including quite recently (more on this to come).

More recently, I found myself trying to explain what “real content management” meant to people who were far removed from the inner-party politics of content management. At first I tried “advanced content management”. This attempted to position the other forms of content management as focused either on parts of the content problem or on a basic level of that problem. Advanced content management would then be looking at a more complete picture, or at the next level up in sophistication. It did not take me long to withdraw this candidate. It sounded more than a little boastful, and this would run the risk of undervaluing the strengths seen in each of the other content specializations.

And then it hit me that what I was talking about was actually Integrated Content Management. There is a sense in which it surpasses the individual efforts seen in the past as enterprise, web, learning, or technical content management. But it only surpasses them in that it combines and cross-leverages their relative strengths, and does so with a view to realizing an integrated benefit for the organization and for the community of content stakeholders who are inescapably involved. I like this latest formulation because it reminds us that it all comes down to integration, a quality that is intrinsic to the nature of content and that is something we have already explored as the core challenge of content management (Why Content Technologies are Hard to Implement). Integration is also the chief benefit and opportunity that can make content management, broadly considered, a much more interesting and impactful player in how organizations operate and evolve.

So in answer to my original question, “Would the Real Content Management Please Stand Up?”, we have currently identified a candidate in Integrated Content Management.
I had thought of extending it to be Integrated Content Business Management, or ICBM, but sometimes acronyms shouldn’t be pushed too far.

There is one more candidate which I have considered in the past, and that is Strategic Content Management. This one is even more ambitious and seeks to paint a picture of content and content management rising in importance and reaching a level where it can genuinely be deemed “strategic”. I will keep this one in reserve, as it is so far beyond where most practice actually hovers that it can start to sound a little comical. It is, however, where we should really be trying to go.



This blog is a compendium of observations that surface amid the work, readings and travels of Joe Gollner. Subjects touched upon include knowledge management, the rise of digital content technologies, the potential role of XML in managing technology, the care and feeding of complex systems, and various topics drawn from history.
