Monday, January 14, 2013

State of Ontario's Forests Report released

According to a news release posted at http://www.woodbusiness.ca/industry-news/state-of-ontarios-forests-report-released:

The Ministry of Natural Resources recently released The State of Ontario's Forests Report (April 2004 to March 2009), and, not surprisingly, it reflects the recession the forest industry experienced during that period.

Produced every five years, the report highlights the major economic downturn the forest sector experienced during the reporting period, including:
• Annual harvesting levels dropped by 10 million cubic metres (43%) over the five-year period;
• While employment levels increased from 2001-2004, forest sector jobs were lost at an average rate of 11% per year from 2001-2006; and
• Estimated operating surpluses (a measure of profitability) decreased by approximately $1.1 billion (57%) in the wood products sector and $814 million (29%) in the paper manufacturing sector over the five-year period.

This is Ontario's third State of the Forest Report, covering the period April 2004 to March 2009. This legally mandated report describes Ontario's forests and forest management based on a criteria and indicators framework for sustainable forestry. Criteria reflect provincial forest sustainability goals and are designed to reflect public values and desired long-term outcomes for Ontario's forests. Indicators are specific measures used to assess progress towards the achievement of forest sustainability goals and objectives.

The latest State of Ontario's Forests report comprises a feature report and a supporting Criteria & Indicators interactive PDF. The feature report focuses on specific themes (state of the forest sector, biodiversity, forest health, and climate change) and provides a high-level summary of the Criteria & Indicators framework.

The official site for the State of the Forest Report is: http://www.mnr.gov.on.ca/en/Business/Forests/2ColumnSubPage/STDPROD_101854.html
The Report can be downloaded from: http://www.mnr.gov.on.ca/stdprodconsume/groups/lr/@mnr/@forests/documents/document/stdprod_101907.pdf

And the interactive Criteria and Indicators PDF can be downloaded from: http://www.web2.mnr.gov.on.ca/mnr/forests/public/publications/SOF_2011/toc_main.pdf

Friday, April 20, 2012

Mobilizing knowledge in the Canadian Forest Service


Through its Policy Framework for Information and Technology, Treasury Board has made it clear that information is an essential component of effective management across the Government of Canada.  The availability of high-quality, authoritative information to decision makers supports the delivery of programs and services, thus enabling departments to be more responsive and accountable to Canadians.

Accordingly, within Natural Resources Canada, much effort is put into ensuring that the department's information services, like the intranet, the wiki, and online databases, provide quick and easy access to information. Still, the uptake of research results by policy and decision makers may be lower than desired, not because of any shortcomings in a set of research results or the dissemination strategy used, but because potential users are unwilling or unable to exploit the opportunities presented to them.

Decision-making and management activities must integrate various types of information in a complex environment of political, economic, and social factors. The challenge is how to acknowledge and integrate social, economic and political context into knowledge systems. We must go beyond simply aiming to provide a "one-stop shop for information." Instead, what decision makers and policy analysts need is not just the facts but a set of solution options informed by research-based knowledge within the decision context.

Much work has been done to look at ways to improve the uptake of research outputs by decision- and policy-makers in the health sciences field. For example, Lavis and colleagues (2003) identified three methods to enhance research utilization by policy-makers: producer push, user pull and knowledge exchange. A similar framework for knowledge management, described as learn before, learn during and learn after, has been advanced by Collison and Parcell in their 2001 book Learning to Fly.

The paradigm of producer push, user pull and knowledge exchange includes knowledge transfer and knowledge translation, which are usually seen as unidirectional flows of knowledge, but it also includes iterative knowledge exchange. In contrast to knowledge transfer and knowledge translation, knowledge exchange involves bringing together researchers and decision makers and facilitating their interaction, starting with collaboration on determining the research question. This ongoing exchange ensures that the knowledge generated is relevant and applicable to stakeholder decision making as well as useful to researchers.

Phipps and Shapson (2009) incorporated these three methods together with the concept of knowledge that is co-produced between researchers and research users to define knowledge mobilization.  They define knowledge mobilization as a suite of services using a diverse array of strategies that connect researchers and research users to enhance research utilization.

While not explicitly working in the area of knowledge mobilization, the Forest Knowledge and Information Management Division (FKIMD) of the Canadian Forest Service (CFS) is attempting to find ways to enhance the impact of CFS scientific research outputs for decision makers, policy analysts and advisors, who seek quality scientific information that is relevant to their work, will influence decisions, and is accessible and understandable in the shortest possible time. The objective of FKIMD's activities is to ensure the discovery, use and dissemination of all information of business value created by the CFS for the delivery of the sector's programs and services. The division also seeks to ensure that IM practices within the sector comply with the Directive on Recordkeeping (as required by March 2014) and that information management standards are aligned with government standards, such as the Standard on Geospatial Data. Finding the right tools and services to effectively mobilize the organization's knowledge is a complex problem that will continue to evolve.

The Division has not previously described its work in terms of knowledge mobilization, but it may be useful to set that work in the context of the four established knowledge mobilization methods (producer push, user pull, knowledge exchange and co-production) used by Phipps and Shapson (2009) to describe the knowledge mobilization services of York University's Knowledge Mobilization Unit. In this way, the work of FKIMD can be compared with an established knowledge mobilization unit.

The comparisons suggest that the activities undertaken by FKIMD are consistent with knowledge mobilization activities undertaken by other science-based organizations and should help the CFS mobilize its knowledge so that decision makers and their support staff have access to quality information that is relevant to their work and influences decision making. The only notable exception is the relatively limited role of user pull activities, characterized at York University by the "help desk", where knowledge brokers introduce a knowledge requester to a knowledge producer and support any emerging conversations that might lead to collaboration.


The following summarizes FKIMD activities by knowledge mobilization (KMb) method, with notes on each activity.

Producer Push
  • Plain language research summaries: concise summaries of the key S&T messages or findings of an S&T publication, written in simple language appropriate for an audience that is not technically expert in the subject of the publication.
  • Knowledge retention interviews: interviews of departing employees, plus presentations and discussions by departing employees with their peers.
  • Directory of expertise: an online tool that allows knowledge users to find subject matter experts in NRCan.
  • Publications database: an improved database of publications that provides access to all CFS publications.

User Pull
  • No activities listed.

Knowledge Exchange
  • Communities of practice: fostering communities of practice as stewards of knowledge.
  • Project-level knowledge exchange strategies: pilot studies to adapt the Results Map process to bring together knowledge producers and knowledge users to co-create knowledge products.

Collaboration / Co-production Infrastructure
  • Simplified disposition authorities: reducing the number of disposition authorities in NRCan from 64 to about 10 and expanding their coverage from about 60% of all information to 100%.
  • Centralized digital repositories: implementing GCDocs as a central digital repository for all digital files of business value, replacing shared drives and SharePoint; it could offer collaborative functions in the future.
  • Paper Legacy Information Strategy: a strategy to enhance the search and retrieval of the CFS' unmanaged and semi-managed paper legacy documents.
  • Foogle - Integrated data sets: an online tool to catalogue, integrate and access CFS digital data sets.
  • Social media to support collaboration: although not a product of FKIMD, the department and the GoC have a full suite of social media tools, including blogging, wikis, forums and SharePoint. In addition, new Guidelines for External Use of Web 2.0 encourage employees to use social media tools like Twitter and LinkedIn.



For more detailed information, read my draft paper, Characterizing the work of Forest Knowledge and Information Division within the context of Knowledge Mobilization.



Tuesday, April 17, 2012

Setting an enabling environment for knowledge sharing

Fancy IT systems and a web of processes and procedures will not increase the flow of knowledge within an organization if the culture of the organization does not value or promote knowledge sharing.  In fact, "setting an enabling environment is fundamental to allowing sharing to happen naturally" (my emphasis added to fundamental), say Chris Collison and Geoff Parcell in their book Learning to Fly: Practical Knowledge Management from Leading and Learning Organizations.

Collison and Parcell looked at knowledge management in organizations across a variety of contexts and found some common threads.  In particular, an enabling environment is achieved by:

  1. A reinforcing leadership style, which challenges and encourages learning and sharing.
  2. Encouraging the right behaviours: behaviours that acknowledge people's strengths, involve active listening, challenge the status quo, develop relationships and build trust.
  3. Taking the time to understand each other, developing shared beliefs and a common vision.
  4. Building facilitation skills to enable people to find their own solutions.
  5. Good change management capabilities; for example, including those affected in the planning, the execution and the outcome.
  6. Collaborative working and learning together from shared experiences.
  7. Common technology that connects people, removes barriers, and makes information widely available.
As I read these seven points for creating an enabling environment for knowledge to flow, I immediately saw similarities with the seven challenges for a learning organization described by Bob Chartier in his book Tools for Leadership and Learning: Building a Learning Organization, and indeed with the role of the Learning Organization Community of Practice in my own organization.

In his book, Chartier describes seven challenges that must be overcome to enable leaders to tap the team's collective wisdom and to share decision-making.  These are:
  1. Shared vision and values
  2. Personal mastery
  3. Systems thinking
  4. Mental models
  5. Team Learning
  6. The Learning Vessel
  7. The art of conversation
Overcoming these seven challenges is the central domain of knowledge upon which the Learning Organization Community of Practice in my organization is focused.  Facilitation is the practice community members engage in to help the organization overcome these challenges.

It is not hard to see the links between Collison and Parcell and Chartier, and I have tried to map the linkages below:
  • A reinforcing leadership style --> Mental Models
  • Encouraging the right behaviours --> The art of conversation, personal mastery
  • Taking the time to understand each other --> Shared vision and values
  • Building facilitation skills --> LOCOP
  • Good change management capabilities --> Systems thinking
  • Collaborative working and learning together --> Team learning, The Learning Vessel
The mapping is not perfect, and one could probably see other ways to map the two authors' key points together.  Also, Collison and Parcell's point about building facilitation skills does not link to one of Chartier's challenges, but it is one of the objectives of Chartier's book and also an important role for the Learning Organization Community of Practice in my organization. Still, the mapping serves to illustrate how the efforts of our Learning Organization Community of Practice are helping to set the enabling environment for effective knowledge flow in our organization. Sometimes the culture change, which is slow and subtle, goes unnoticed alongside flashier IT products like wikis, search tools, forums, blogs and directories of expertise.


Friday, March 30, 2012

To get people to learn, you must give them a reason to care

I recently stumbled across a blog post about "Learning as Care" by Nick Shackleton-Jones.  In the post, Shackleton-Jones makes the point that the most important part of getting anyone to learn something is getting them to care.  Here are some highlights from his post:

  • if people really cared about something we would have no work to do. And if we can't make people care, then we have usually done no work
  • we disseminate information without giving people a reason to care
  • we fail to provide learning resources to people who do care
  • don't tell people what is important, tell them why; tell the story
  • care is the central mechanism at the heart of all human learning - it governs both how we store information and how we subsequently use it
This is very timely information for me as we prepare to find new ways to get staff to manage their paper documents properly, a topic that our surveys show staff care about, but are intensely frustrated by.

Friday, October 28, 2011

The power of storytelling: strengthening science-policy integration when times are uncertain, and the ideal future state cannot be described

When I used to work as a scientist in a regional science and technology unit for the Government of Ontario, there was a popular, somewhat tongue-in-cheek refrain at the end of meetings between scientists and policy analysts. The policy analysts would say, "You scientists never give us an answer we can use!" To which the scientists would retort, "You policy folks never ask us a question we can answer!" For as long as I have been working in government, people have been trying to improve the integration between policy and science.  Yet we are still at it and success seems as elusive as ever.

There are ample reasons why science and policy are so difficult to integrate, and there is lots of good work being done to try to address the issue.  But a recent half-day session with David Snowden gave me new inspiration for how one might tackle this particularly thorny problem.

If I have captured the parlance of Snowden's Cynefin framework correctly, then I think science-policy integration must be a complex problem.  That is to say, a problem in which cause and effect are only coherent in retrospect and do not repeat, in which there are no right answers and many competing ideas, and in which problems and solutions interact so that the system is constantly evolving.

Snowden offered a few pieces of advice that resonated with me when it comes to solving complex problems.

First, he says, don’t waste time trying to figure out what to do. Instead, probe with small experiments, then monitor and adapt.  Since you cannot define what the ideal future state will be, start with a good definition of the present and move forward with safe-to-fail experiments that may lead to unforeseen outcomes.  It’s cheaper and more successful.  If you can accept that your theory for proceeding is coherent with the facts - with the way you understand the present - then you can move to a place where the outcome is uncertain.  In other words, you have safety in direction, not safety in outcome.

Second, you need to monitor the experiments carefully, with impact indicators, not output indicators. In complex problems, argues Snowden, you cannot manage the outcomes because they are emergent.  However, you can manage the boundaries of the issue you wish to deal with, the tools and processes you put in place to influence the patterns of behaviours in the system, and the resources devoted to amplifying positive patterns and dampening negative patterns. Snowden gave an example of how focusing on outcome indicators can derail solving complex problems.  In the UK, a hospital authority decided that it was unacceptable to have people in the emergency waiting room for longer than 4 hours and in the emergency ward for longer than 48 hours. The result was that patients were not properly triaged or treated.  They were pushed through the system and on to the wards based on how long they had been there, not based on their medical need.  The quality of care did not increase, but the emergency room met its targets.

Third, it is really difficult to address complex problems directly.  Instead, address them obliquely.  Many complex problems are about changing organizational culture.  But it is very hard to change people. Instead, argues Snowden, change the system and the people will change to match it.  Nobody, for example, is going to share information across silos just because they had a workshop and were told they should share.

So what does all this say about developing evidence-based policy to strengthen science-policy integration when times are uncertain, and the ideal future state cannot be described?

Snowden gave a number of examples of work he has done to solve complex problems using self-indexed micro-narratives, which may be relevant to strengthening science-policy integration in an organization such as the one I work in.

At the heart of Snowden's examples is a process in which people are asked first to tell a story about a particular topic and then to score or weight their story using a carefully constructed index.  The stories are recorded in any number of formats: written, audio, video.  The format is not important, as long as the stories are left unfiltered and are not summarized.  The index is similar to keywords used to describe the story, but much more sophisticated.  It takes the form of a triangle with a carefully selected keyword at each corner. The storyteller is asked to place a dot inside the triangle, closest to the word that best describes their story.  The placement of the dot yields three quantitative weights, one relative to each corner of the triangle.  These weights can be used to plot the stories on a 3-D graph. Storytellers are often asked to score their story on several indexes, which can be recombined to create different graphs.  Similarly scored stories show up as clusters on the graph.
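To make the scoring step concrete, here is a minimal sketch, in Python, of how the position of a dot inside such a triangle can be converted into three weights using barycentric coordinates. This is my own illustration rather than Snowden's software, and the corner labels reuse the hypothetical behaviour index suggested later in this post.

```python
# A minimal sketch (illustrative only): convert the position of a storyteller's
# dot inside a triad triangle into three weights, one per corner keyword,
# using barycentric coordinates.

def triad_weights(p, a, b, c):
    """Return (w_a, w_b, w_c): how strongly point p leans toward corners a, b, c.

    p, a, b and c are (x, y) tuples; the weights sum to 1 and are all
    non-negative when the dot lies inside the triangle.
    """
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    denom = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    w_a = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / denom
    w_b = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / denom
    w_c = 1.0 - w_a - w_b
    return w_a, w_b, w_c

# Hypothetical triad: corners labelled "competitive", "cooperative", "altruistic",
# with the dot placed near the "cooperative" corner.
corners = {"competitive": (0.0, 0.0), "cooperative": (1.0, 0.0), "altruistic": (0.5, 0.866)}
dot = (0.8, 0.1)
weights = dict(zip(corners, triad_weights(dot, *corners.values())))
print(weights)  # most of the weight lands on "cooperative"
```

Each story then carries a small vector of numbers like these for every index it was scored on, which is what makes the statistical plotting and clustering possible.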

Snowden gave an example of this technique from when he worked with the CIA in the ‘70s.  The CIA funded an American university to work with a French university to study attitudes in Iran.  The French university had some Iranian professors, and as part of the study the professors asked Iranians to tell them stories about Iran and then self-index the stories.  After collecting about 18,000 stories, two clear clusters of stories emerged.  One cluster was stories that related a strong dislike of America.  The other cluster was stories that related a strong dislike of the West.  This was not too encouraging, but they continued collecting stories.  After 21,000 stories, a third cluster emerged.  This cluster consisted of stories that related the concept of not wanting to be seen as a barbarian.  Snowden recognized that this was the opportunity for intervention; that if the US could somehow emphasize the latter stories, it might drain energy away from the other stories.

Snowden did not go into detail about what the CIA did, but he did give more detail about a project he is working on right now in Mexico City.  This project is focused on changing the culture of violence associated with gangs and drugs.  They have collected about 200,000 self-indexed stories from ordinary people on the street.  When they analyze the stories, they are confident that a cluster of stories will emerge about the violent gang culture.  However, they also believe that a number of positive stories will emerge.  Once they find out what those positive stories are, they will work with experts in Hollywood to create films, TV spots, multi-media presentations, whatever it takes to emphasize the positive stories, and hopefully, drain energy away from the negative stories.

For Snowden, the culture of a society or an organization is wrapped up in its stories.  If you can change the stories people tell, you have changed the culture.

So how does this relate to science-policy integration?

I think that strengthening science-policy integration within a science-policy organization is actually a culture change problem.  So, what if we took this approach:

  1. Record stories from employees in the organization about their science-policy interaction experiences and have them self-index the stories using carefully selected indexes (e.g., is the behaviour in this story best described as "competitive", "cooperative", or "altruistic"?);
  2. Graph the stories to find clusters of positive behaviours that might represent opportunities to intervene;
  3. Develop and implement some safe-to-fail policies, guidelines, or tools to reinforce the positive behaviours and dampen the negative ones;
  4. Collect stories again, perhaps a year later, and see if the clusters of stories have moved one or two index points toward more desirable values. Resources are then given to the tools that seem to be working and taken away from the tools that are not.

Success is measured by the index values of the stories, which measure the impact of the actions taken to influence the system.  Success is not measured by output indicators, like the number of meetings scientists and policy analysts had.
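As a rough sketch of how steps 2 and 4 might be operationalized, the self-indexed weights could be clustered to find groups of similar stories and then compared between collection rounds. This is my own illustration, not a method Snowden prescribed; the index labels and the data are invented.

```python
# Illustrative sketch: cluster self-indexed stories, then measure drift in the
# index values a year later as an impact indicator.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical data: each row is one story's self-assigned weights on the three
# corners of a triad index, e.g. ("competitive", "cooperative", "altruistic").
stories_year1 = rng.dirichlet([1, 1, 1], size=200)
stories_year2 = rng.dirichlet([1, 2, 1], size=200)  # pretend "cooperative" stories grew

# Step 2: look for clusters of similar stories (possible intervention points).
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(stories_year1)
print("cluster centres:\n", km.cluster_centers_)

# Step 4: a crude impact indicator - has the average story moved toward the
# desirable corner ("cooperative", the second column) between collection rounds?
drift = stories_year2.mean(axis=0) - stories_year1.mean(axis=0)
print("shift in mean index weights:", drift)
```

The point is that the indicator is computed from the stories themselves (an impact measure), not from counts of activities undertaken (an output measure).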

There is obviously a lot of detail I am missing here, and I need to familiarize myself more with Snowden's techniques.  But at first blush, this seems like a promising approach to strengthening science-policy integration in a complex environment.

Chefs versus recipe users: LOCOP as an apprentice program for leadership

Can NRCan’s Learning Organization Community of Practice (LOCOP) be thought of as an apprentice program for leaders?

Recently, I attended a half-day session with David Snowden, author of the Cynefin framework for solving problems. Snowden makes a distinction between how one should solve complex problems, versus how one should solve simple or merely complicated problems. I won’t go into details here, but suffice it to say that in a knowledge-based economy where innovation is required, we need the type of people who can solve complex problems. In other words, we need chefs, not recipe users!

Snowden made the point that there is a big difference between a chef and a recipe user. Sure, if you have all the right equipment in your kitchen, you lay out all the tools and necessary ingredients and you have a good recipe to follow, then just about any competent person can produce a reasonably good meal. But only a chef can walk into your kitchen, see what’s in the fridge, and create a truly exceptional meal.

The difference, Snowden asserts, is that chefs possess practical wisdom.

Wisdom is the ability to reflect on one’s knowledge or experience. Practical, here, means it was acquired through the process of practice – in a chef’s case, as an apprentice.

The beauty of the apprentice model is that it allows someone to imperfectly mimic the master and make mistakes. Studies have shown that people recall far more knowledge when they actually act on their knowledge than when they just think about it. In an apprenticeship program, one practices what one has learned from books, but in an environment where it is safe to make mistakes. The result is a much greater ability to recall and reflect on that knowledge for innovative results.

Snowden also made the point that doctors and lawyers use the apprentice model, but managers have no such system; instead, they have the MBA.

That’s when I started to rethink the role of our Learning Organization Community of Practice as an apprentice program for leaders. When I first took my LOCOP training, I came out of that training thinking of myself as an apprentice - but an apprentice in facilitation. Now, I recognize that I am really an apprentice in becoming a leader.

Every time I use my LOCOP facilitation tools to develop a shared vision in a team, to think about the whole puzzle at once, to create space for new learning, to foster deep reflective listening and build shared meaning in conversation rather than argument, I am conducting a small, safe-to-fail exercise in which I practice the theory I learned in my original training. The result is that I now have a bucket of tools in my back pocket that I can mix and match and modify to solve all kinds of problems in a collaborative and increasingly innovative way.

Add to that the value of having a community from which I can learn new techniques, with which I can validate my own ideas, and which I can call on to help me solve tough problems, and I think we have many of the essential elements of a low-cost apprentice program for leaders right in my place of work.

Saturday, October 15, 2011

A knowledge management conundrum: how to share secret information

One KM issue that has sat in the back of my mind for some time is how to share information classified as secret among employees.

We spend a lot of time in our workplaces implementing document management solutions like SharePoint, writing collaboratively on wikis, fostering knowledge exchange through communities of practice, and so on: basically trying to make the knowledge contained in the organization findable and retrievable so it can contribute to evidence-based decision making.

But none of these tools can address the issue of how to share secret information.  Documents classified as secret hold a wealth of valuable data, opinion and insight, and should be part of an organization's evidence base for decision making, in a form that is findable and retrievable by a person with the right security clearance and a clearly demonstrated need to know.

That's the thing about documents classified as secret; they can be shared with someone if the recipient has a clearly demonstrated need to know, but cannot be made freely available to people to trawl through on the possibility they might find something useful - even if the searcher has a secret-level security clearance.

Not surprisingly, this is not a new challenge for intelligence organizations. Recently, I participated in a workshop with David Snowden, who gave me some insight into how US intelligence agencies deal with this challenge.

According to Snowden, in the CIA of a few years ago, when an intelligence officer would receive a piece of intelligence to review, say an intercepted phone call or email, they would analyze it, write a short report about it, and file it. It was difficult to share the information, particularly among agencies, because it was all secret. Connecting the dots between pieces of intelligence to create a big picture view generally relied on officers remembering what they read. But sometimes, they might have read it years ago.

So instead, the CIA started a process whereby, when an officer received a piece of intelligence, the officer would index it using carefully constructed quantitative indexes (somewhat like keywords, but more sophisticated; for example, an index might ascribe weights to a piece of intelligence depending on whether it is associated with the Middle East, Europe, or North America).

Because each intelligence piece now has quantitative indexes associated with it, the data can be analyzed statistically or plotted on a 2-D or 3-D graph to search for patterns.  When patterns emerge, such as a cluster of data points, the records associated with these data points can be requested by the intelligence officer, who can now clearly demonstrate a need to know.

Furthermore, this quantitative metadata about the intelligence records can be shared with other agencies, who might filter it or analyze it in different ways, or add their own data  to search for other patterns.  If they find a pattern, they can request the relevant records because they have the appropriate clearance level and can clearly demonstrate a need to know.
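As I understand the approach, the key is a record structure that keeps the classified payload and the quantitative index separate, so that only the index circulates for bulk analysis. The sketch below is my own illustration, not an actual agency system; the record IDs, labels and weights are invented.

```python
# Illustrative sketch: only the quantitative index metadata is pooled and shared;
# the classified content is released record-by-record on a demonstrated need to know.
from dataclasses import dataclass, field

@dataclass
class SecretRecord:
    record_id: str
    classification: str                        # e.g. "SECRET"
    content: str                               # classified payload; never shared in bulk
    index: dict = field(default_factory=dict)  # quantitative weights, e.g. region scores

    def shareable_metadata(self) -> dict:
        """Return only what may be pooled, analysed, or passed to another agency."""
        return {"record_id": self.record_id, "index": dict(self.index)}

records = [
    SecretRecord("r-001", "SECRET", "<classified text>",
                 {"middle_east": 0.7, "europe": 0.2, "north_america": 0.1}),
    SecretRecord("r-002", "SECRET", "<classified text>",
                 {"middle_east": 0.8, "europe": 0.1, "north_america": 0.1}),
]

# Analysts work only with the pooled index metadata; a cluster of similar index
# values becomes the evidence cited when requesting the underlying records.
pooled_metadata = [r.shareable_metadata() for r in records]
print(pooled_metadata)
```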

In my own organization, secret documents are locked away in secure cabinets or stored on computers that are not connected to the network. Even worse, documents may not be declassified when the need to keep them secret no longer exists.  No matter how useful a Memorandum to Cabinet, for example, might be to me, I have no way to know it even exists.

So now I am wondering whether it would be possible in my own organization to have every secret document indexed by its author and the metadata made available for analysis. If so, a whole world of organizational knowledge could be made available to those with a need to know, to inform evidence-based decision making.