Technology Enhanced Knowledge Research Institute (TEKRI)

TEKRI blogs

Learning as Artifact Creation

elearnspace (George Siemens) - September 14, 2017 - 13:04

Digitization is deceptive in that the deep impact isn’t readily observable. Remember when MOOCs were going to transform higher education? Or when personalized learning was going to do away with instructors? Going back about a century, audio, and then video, were also going to disrupt education. All of these trends have been window dressing – a facade more reflective of the interests of those who advocate for them than of any substantive departure from established norms.

Yet, change is happening, often under the radar of enthusiasts because it’s harder to sell a technology product or draw clicks to a website when being nuanced and contextual. Education is an idea/information-based process. How information is accessed, created, and shared is revealing about the future of learning. Essentially, follow information in order to understand the future of higher education. Today, information is networked and digital. University transformation and proposed innovation should align to this reality to have a broad impact – notably on student learning and the development of knowledge in society.

In 2004, I tried to respond to the network/digitization alpha trend by describing the new actions and experiences that were available to learners: Connectivism: A Learning Theory for the Digital Age. This article, and subsequent work with Stephen Downes, informed the later development of MOOCs and learning analytics.

Connectivism was presented as a theory that described how learning happened in networks, complex ambiguous information spaces, digital environments, and the increased opportunities of the participative web. Unfortunately, much of that theory remains undeveloped. Details regarding cognitive processes, teacher actions, learner mindsets, design models, and social interaction remain rudimentary. I’m confident that these will be developed over time, but progress has been slow. As a result, connectivism is something people cite rather than engage and develop into a more complex theory or framework of learning. But, whether connectivism or by some other name, a social networked model of learning is our future.

Enter the artifact…

One aspect of connectivism that has great potential for development is the role of the artifact in learning. With CCK08, we found fascinating activities arising from student-created artifacts. One student creates an image to detail the architecture of the course. Another updates it and adds to it. Another comes by and critiques it. The artifact serves as a social learning object. This process reflects my earlier point: big trends unfold behind the scenes over time, and in education they map onto and mirror what people do with information that is digital and networked.

Here’s an example: Education over the past several centuries has been defined by the centrality of the instructor and the actions of a learner in relationship to what the instructor knows. There have always been voices that challenged this model – Dewey, Illich, Freire, Montessori – but the system of learning that defines our society is modeled on the assumption of learners needing to duplicate what instructors already know. Learning artifacts – a paper or a test – were largely held between the instructor and student. Group work and class presentations brought others into the relationship, but the message was still clear: the instructor and the content were central, all else was held in their orbit.

Then the internet happened. And later the web. Small groups of people could share without a mediator. You didn’t need a publisher to blast your thoughts to a bulletin board. Yahoo groups, Friendster, and other early social software didn’t fully live up to the vision of the mesh, but they enabled communication. Content creation was still largely the domain of experts or people in positions of control. Britannica and newspapers were still gatekeepers. Then the late 1990s rolled around and Blogger made self-publishing reasonably accessible to anyone.

We could now create artifacts, not only talk about them.

A stunning period of web innovation occurred between 2000 and 2005: delicious, myspace, many blog platforms, flickr, wikis, etc. The gates were opened: everyone was a content creator and everything was subject to user creation. Everything was a possible social artifact. Take and share a picture. Post your thoughts on a blog. Tag and share valuable resources. The web had its Velveteen Rabbit moment and became real to people who had previously been unable to easily share their creative artifacts. Eventually we were blessed with the ugly stepchildren of this movement (Twitter, Facebook) that enabled the flow of creative artifacts but were not themselves primarily generative technologies.

Educationally, this provided new opportunities for students. That class lecture that didn’t make sense? There is a better resource online. That stats textbook that is confusing? There is a MOOC for that. Don’t like a class? Tell the web. Don’t like your instructor? Tell the web (rate my prof). Have an important thought to share? Upload a video to youtube. An awesome song? Upload. Share. A terrific painting you’ve been working on? Upload. Share.

Consider the impact of these opportunities on education and how poorly the higher education system has responded. Consider our curriculum as a self-contained coherent resource. The goal of education? Teach this container to the students. What happens when you add artifact creation? The entire curriculum can shift. If I lecture on the development of open learning and open source technologies, I’m presenting my voice, my priorities, my values. If someone comes along and says, “What about the power structure and the bias that underpins this content?” Bam. It’s a new course. Someone creates a video reacting to a lecture I delivered? Bam. It’s a new course. This doesn’t always happen on grand scales. Often the artifact has a limited impact – a brief detour into a new conversation and learning direction for students. The aggregate of these artifacts is significant because it places students in a new mindset, one defined by personal autonomy and agency.

All of this is obvious. It’s mainly about the permissions that technology enables: namely, to write ourselves and our values into any curriculum and learning interaction. The impact that this has on the learning experience is not well understood. We have theories of community (community of inquiry, community of practice). We have many theories of content and content interaction (including transactional distance). There is something about the artifact that is unique in its ability to make every learner a teacher, every contribution a redirection of learning, every interaction a reaction and augmentation.

In one of our LINK grants, we are currently exploring the power of artifacts as redirective entities (an NSF grant titled COCOA). How does creating a blog post, video, or meme contribute to enlarging the curriculum? How do artifacts contribute to, and take away from, the course content? What is a well-designed artifact? What causes resonance between learner resources being shared and the students who respond to those resources?

Learning Analytics Courses

elearnspace (George Siemens) - September 11, 2017 - 11:48

After about a year of planning, we can finally announce the following courses on edX focusing on learning analytics. The intent is for these courses to eventually lead into a MicroMasters and then to advanced placement in an in-development Master of Science in Learning Analytics at UTA. Each course runs about three weeks, and we’ve tried to settle on prominent analytics tools for educational data so that the skills learned can be applied immediately.

We’ve structured these courses to provide an intro to analytics in education (a good complement to the courses is our recently released Handbook of Learning Analytics – free download):

  • Learning Analytics Fundamentals
  • Social Network Analysis
  • Cluster Analysis
  • Predictive Modeling in Learning Analytics
  • Feature Engineering for Improving Learning Environments
  • Connecting Learning Data to Improve Instructional Design
  • Natural Language Processing and Natural Language Understanding in Educational Research
  • Knowledge Inference and Structure Discovery for Education
  • Multi-modal Learning Analytics

We have exceptional instructors – world leaders in the field. We are, however, well aware of the gender imbalance. We had five faculty (women) who ended up dropping out due to existing commitments. If you’d like to help right this imbalance, email me and let me know which courses or topics you’d like to instruct in the LA domain.

    Open Education: Please give

    elearnspace (George Siemens) - September 8, 2017 - 08:44

    In about a month, David Wiley and I are teaching this course on edX: Introduction to Open Education. As we are both firm adherents to social and participatory pedagogical models (i.e. we like it when others do our work), we need some help. Specifically, we’d love to have faculty/researchers/practitioners provide short 3-5 minute reflections on one or more of the following topics:

    Week 1: Why Open Matters
    Week 2: Copyright, The Public Domain, and The Commons
    Week 3: The 5R Activities and the Creative Commons Licenses
    Week 4: Creating, Finding and Using OER
    Week 5: Research on the Impact of OER Adoption
    Week 6: The Next Battles for Openness: Data, Algorithms, and Competency Mapping

    The process:
    1. Create a short video/tutorial or any other artifact (if we have yours by Sept 14, we’ll include it in the course) responding to any of the above weekly topics
    2. Upload your creation to some site where we can access/download it
    3. Email me a link (gsiemens, gmail) or share on Twitter using #openedmooc or leave a link in the comments
    4. Sit back and enjoy the feeling of accomplishment that will wash over you knowing you’ve made the world a better place.

    New Project: Digitizing Higher Education

    elearnspace (George Siemens) - June 8, 2017 - 11:51

    This fall, I’ll be running a course on edX with a few colleagues on Digitizing Higher Education. This course is part of a larger initiative that I’ll be rolling out later this month, focused on helping universities transition into digital systems: University Networks.

    Here’s the pitch:

    Higher education faces tremendous change pressure, and the structures now being formed will alter the role of universities in society for the next several generations. The best time to change a system is when it is already experiencing change. A growing number of consulting agencies and service providers are starting to enter the higher education space, bringing visions that are not tightly focused on learner development and service to knowledge advancement in research domains – i.e. a shift to utilitarian views of education. I’m concerned that in the process, universities will lose control over their enterprise and will become some version of corporate lite.

    I recognize that universities need to change. They need to start with a basic question: If we were to create a model of higher education today that serves the needs of learners and society, what would it look like given our networked and technologically infused society? The answer is not pre-existing. It’s something that we need to explore together. Societies and regions that make this change will benefit from increased employment opportunities for citizens, higher quality of life, and greater control over their future.

    The project, University Networks, involves working with a small number of universities, or specific faculties and departments, that are committed to rethinking and redesigning how they operate. My goal is to bring on 30 universities and, over a period of 4 years, rethink and redesign university operations to align with the modern information and knowledge ecosystem. The intent is to impact 1 million learners over the next four years by offering innovative teaching and learning opportunities, utilizing effective learning analytics models, integrating learning across all spaces of life, and bringing a digital and networked mindset to organizational operations.

    A few details:

    • This is a cohort model where universities learn from each other and share those resources and practices that can be shared – for example, shared curriculum and shared quality rubrics. The cohort model enables more rapid change since the investments of all universities in the network will increase the value of the resources for everyone.
    • We provide centralized consultancy (this is a non-profit) where we enter a university for two weeks of in-depth analysis of existing practices and work with leadership to plan future investments and goals. Once this analysis is done, each university will enter one of ten modules based on their current progress. For example, a university without an LMS will enter module one whereas a university with advanced infrastructure but looking to develop online programs will enter at module four.
    • The shared consultancy and cohort model results in universities working with a fraction of the investment needed in working with a traditional corporation or consultancy firm. Clearly enabling partners will be needed and we’ll support and advise in that area as well. Our focus, however, is on rapid innovation owned and controlled by the university.
    • My motivation for this is twofold: 1. to serve the advancement of science through modern universities that reflect the information age and the changing economy. 2. to actively research systemic transformation in higher education.
    • As partners in university innovation, we (through Interlab) have deep expertise in machine learning, systemic innovation, networked learning, online learning, and digitization of organizations. More on our group here: http://interlab.me/collaboration/. What does this mean? Basically that we are committed to repositioning higher education for the modern era and that we have the skillsets to deliver on that commitment.
    • If you are interested in learning more, please email me: contact me. We are hosting an information event on June 30. We’ll provide more information at that time about the project, getting involved, and our expectation of university partners.

    We have an excellent advisory board directing this project:

    • John Galvin (Intel)
    • Dror Ben-Naim (Smart Sparrow)
    • Katy Borner (Indiana University)
    • Al Essa (McGraw-Hill)
    • Casey Green (Campus Computing Project)
    • Sally Johnstone (NCHEMS)
    • Mark Milliron (Civitas)
    • Catherine Ngugi (Open Education Africa)
    • Deborah Quazzo (GSV Advisors)
    • Matt Sigelman (Burning Glass)

    Handbook of Learning Analytics (open)

    elearnspace (George Siemens) - June 7, 2017 - 08:23

    When we started the learning analytics conference in 2011, we aligned with ACM. We received a fair bit of criticism for not pursuing fully open proceedings. Some came from our sister organization, IEDMS, which has open proceedings. We made the difficult choice to go the traditional route of quality, indexed proceedings, largely to ensure that colleagues from Europe and Latin America could receive funds for their travel. Advocates for openness often don’t appreciate that a key tension for researchers is whether to publish for impact or for prestige. Prestige, as defined by so-called “reputable” journals, is often a requirement for getting government funding for travel.

    To ensure broader dissemination of our research (and to cope with our guilt), we set up an open journal: the Journal of Learning Analytics.

    I’m very excited about a new project that started as an idea during LAK13 in Leuven and is another commitment to openness by the Society for Learning Analytics Research: The Handbook of Learning Analytics. The book, CC-licensed, weighs in at 356 pages and provides a good snapshot of the status of learning analytics as a field. It’s a free download (both the full book and individual chapters). Given the number of master’s programs that now incorporate learning analytics courses, and the growing number of dedicated LA master’s programs, we felt it was important to get a research document into the public space.

    Understanding Mobile Learning at Athabasca University through MobiGlam (UMLAUT-M): Do the Benefits Justify the Cost and Time? at the 2008 International Conference on Mobile Learning

    Mohammed Ally's publications - May 29, 2017 - 08:00
    Title: Understanding Mobile Learning at Athabasca University through MobiGlam (UMLAUT-M): Do the Benefits Justify the Cost and Time? at the 2008 International Conference on Mobile Learning

    Authors: Ally, Mohamed

    Abstract: The goal of the UMLAUT-M project is to investigate the viability and pedagogic usefulness of mobile access to online course materials. Although the project team is composed of researchers and programmers from the United Kingdom and Canada, the project itself is being conducted at a distance university in Canada. Students of this university receive textbooks, manuals, and other materials through the mail. Currently, there is some provision for person-to-person interaction through telephones, the learning management system (Moodle), and various other electronic tools. Yet, there remains an apparent lack of "connectedness" among learners because of the physical and temporal separation of the instructors and the learners. The project tested a system called MobiGlam, which allows students to access Moodle courses through a variety of mobile devices such as cellular telephones, PDAs, and smartphones.

    Description: This first presentation was attended by approximately 40 people from many countries. The first part of the paper presented information on why the research study was conducted while the second part of the paper presented details on the research study. Many attendees are starting to use Moodle as the Learning Management System (LMS) and are interested in finding out how AU is planning to use Moodle on mobile devices. At the end of the session there were questions on the implementation of Moodle on mobile devices. The time for the presentation was short. It would have been helpful to have a longer session.
    Categories: Publications

    Our trip to Italy – April 2017

    Terry Anderson's blog - May 1, 2017 - 20:48
    Note: What follows is a 6 page account of the 24 days that Susan and I spent as tourists in Italy in April 2017. Hopefully it can be used by ourselves to recall those names and dates we too easily forget and for others to help plan similar vacations. Introduction: Despite the numerous personal and […]

    What the FOLC is new in this article?

    Terry Anderson's blog - April 28, 2017 - 02:17
    Sorry, but I couldn’t resist spoofing, in the post title, the unfortunate sound of the acronym for the “new” model proposed in this article. Now that I’ve got it out of the way, I can only suggest that if this “divergent fork of the Community of Inquiry model” is to survive, it needs a new English […]

    Two Canadian Movies – Two Canadian Narratives

    Terry Anderson's blog - March 5, 2017 - 16:07
    I’ve just finished watching two films made about, paid for, and watched with interest by many Canadians. The first film, TransCanada Summer, takes the viewer across the country in 1958. Throughout the trip the film celebrates the industrialization and progress resulting from the construction of the Trans Canada Highway – the longest highway in the world at that […]

    When cats play sax

    Jon Dron's blog - October 29, 2016 - 13:16

    This is what happened when Beelzebub the Cat decided to try to play my saxophone after I had foolishly left it on its stand without its mouthpiece cap.

    He seriously needs to work on his embouchure.

    I seriously need to disinfect my mouthpiece.

    True costs of information technologies

    Jon Dron's blog - August 16, 2016 - 18:13

    Microsoft unilaterally and quietly changed the spam filtering rules for Athabasca University's O365 email system on Thursday afternoon last week. On Friday morning, among the usual 450 or so spams in my spam folder (up from around 70 per day in the old Zimbra system) were over 50 legitimate emails, including one to warn me that this was happening, claiming that our IT Services department could do nothing about it because it's a vendor problem. Amongst junked emails were all those sent to the allstaff alias (including announcements about our new president), student work submissions, and many personal messages from students, colleagues, and research collaborators.

    The misclassified emails continue to arrive, 5 days on.  I have now switched off Microsoft's spam filter and switched to my own, and I have risked opening emails I would never normally glance at, but I have probably missed a few legitimate emails. This is perhaps the worst so far in a long line of 'quirks' in our new O365 system, including persistently recurring issues of messages being bounced for a large number of accounts, and it is not the first caused by filtering systems: many were affected by what seems to be a similar failure in the Clutter filter in May.

    I assume that, on average, most other staff at AU have, like me, lost about half an hour per day so far to this one problem. We have around 1350 employees, so that's around 675 hours - 130 working days - being lost every day it continues. This is not counting the inevitable security breaches, support calls, proactive attempts at problem solving, and so on, nor the time for recovery should it ever be fixed, nor the lost trust, lost motivation, the anger, the conversations about it, the people that will give up on it and redirect emails to other places (in breach of regulations and at great risk to privacy and security, but when it's a question of being able to work vs not being able to work, no one could be blamed for that). The hours I have spent writing this might be added to that list, but this happens to relate very closely indeed to my research interests (a great case study and catalyst for refining my thoughts on this), so might be seen as a positive side-effect and, anyway, the vast majority of that time was 'my own': faculty very rarely work normal 7-hour days.

    Every single lost minute per person every day equates to the time of around 3 FTEs when you have 1350 employees. When O365 is running normally, it costs me around five extra minutes per day compared with its predecessor, an ancient Zimbra system. I am a geek who has gone out of his way to eliminate many of the ill effects: others may suffer more. It’s mostly little stuff: an extra 10-20 seconds to load the email list, an extra 2-3 seconds to send each email, a second or two longer to load them, an extra minute or two to check the unreliable and over-spammed spam folder, etc. But we do such things many times a day. That’s not including the time to recover from interruptions to our work, the time to learn to use it, the support requests, the support infrastructure, etc.
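    The FTE arithmetic above is easy to check with a quick back-of-envelope calculation. A minimal sketch (the employee count and the 7-hour working day are taken from the post; the function itself is mine, for illustration):

```python
# Back-of-envelope: how a per-person daily time loss scales across an
# organization of 1350 employees (figures from the post; 7-hour working day).

EMPLOYEES = 1350
WORKDAY_HOURS = 7

def lost_ftes(minutes_per_person_per_day: float) -> float:
    """Convert a per-person daily time loss into full-time equivalents."""
    total_hours_per_day = EMPLOYEES * minutes_per_person_per_day / 60
    return total_hours_per_day / WORKDAY_HOURS

print(round(lost_ftes(1), 1))   # 1 lost minute each: ~3.2 FTEs
print(round(lost_ftes(30), 1))  # 30 lost minutes each: ~96.4 FTEs
```

    Note that converting hours into "working days" depends on the day length assumed: the 675 lost hours quoted above come to roughly 96 seven-hour days.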

    To be fair, whether such time is truly 'lost' depends on the task. Those 'lost' seconds may be time to reflect or think of other things. The time is truly lost if we have to put effort into it (e.g. checking spam mail) or if it is filled with annoyance at the slow speed of the machine, but may sometimes simply be used in ways we would not otherwise use it.  I suspect that flittering attention while we wait for software to do its thing creates habits of mind that are both good and bad. We are likely more distracted, find it harder to concentrate for long periods, but we probably also develop different ways of connecting things and different ways of pacing our thinking. It certainly changes us, and more research is needed on how it affects us. Either way, time spent sorting legitimate emails from spam is, at least by most measures of productivity, truly time lost, and we have lost a lot of it.

    Feeding the vampires

    It goes without saying that, had we been in control of our own email system, none of this would have happened. I have repeatedly warned that putting one of the most central systems of our university into the hands of an external supplier, especially one with a decades-long history of poor software, broken or proprietary standards, weak security, inadequate privacy policies, vicious antagonism to competitors, and a predatory attitude to its users, is a really stupid idea. Microsoft's goal is profit, not user satisfaction: sometimes the two needs coincide, often they do not.

    Breakages like this are just a small part of the problem. The worst effects are going to be on our capacity to innovate and adapt, though our productivity, engagement and workload will all suffer before the real systemic failures emerge. Microsoft had to try hard to sell it to us, but does not have to try hard to keep us using it, because we are now well and truly locked in on all sides by proprietary, standards-free tools that we cannot control, cannot replace, cannot properly understand, that change under our feet without warning, that will inevitably insinuate themselves into our working lives. And it's not just email and calendars (which can use only slightly broken standards) but completely opaque standards-free proprietary tools like OneDrive, OneNote and Yammer.

    Now that we have lost standards-compliance and locked ourselves in, we have made it unbelievably difficult to ever change our minds, no matter how awful things get. And they will get more awful, and the costs will escalate. This makes me angry. I love my university and am furious when I see it being destroyed by avoidable idiocy.

    O365 is only one system among many similar tools that have been foisted upon us in the last couple of years, most of which are even more awful, if marginally less critical to our survival. They have replaced old, well-tailored, mostly open tools that used to just work: not brilliantly, seldom prettily, but they did the job fast and efficiently so that we didn't have to. Our new systems make us do the work for them. This is the polar opposite of why we use IT systems in the first place, and it all equates to truly lost time, lost motivation, lost creativity, lost opportunity.

    From leave reporting to reclaiming expenses to handling research contracts to managing emails, let's be very conservative indeed and say that these new baseline systems just cost us an average of an extra 30 minutes per working day per person on top of what we had before (for me, it is more like an hour, for others, more).  If the average salary of an AU employee is $70,000/year that's $5,400,000 per year in lost productivity. It's much worse than that, though, because the work that we are forced to do as a result is soul-destroying, prescriptive labour, fitting into a dominative system as a cog into a machine. I feel deeply demotivated by this, and that infects all the rest of my work. I sense similar growing disempowerment and frustration amongst most of my colleagues.
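    The salary estimate above is straightforward back-of-envelope arithmetic. A minimal sketch (the function and its structure are mine; note the result is sensitive to the assumed length of the working day: a 7-hour day gives about $6.75M per year, while an 8.75-hour day reproduces the $5.4M figure quoted above):

```python
# Rough annual salary cost of extra process overhead, assuming the cost of
# lost time is proportional to the share of the working day it consumes.
# Employee count and average salary are from the post; day length varies.

EMPLOYEES = 1350
AVG_SALARY = 70_000  # dollars per year

def annual_cost(minutes_per_day: float, workday_hours: float) -> float:
    """Salary value of a daily per-person time loss, across all employees."""
    fraction_of_day = minutes_per_day / (workday_hours * 60)
    return EMPLOYEES * AVG_SALARY * fraction_of_day

print(f"${annual_cost(30, 7):,.0f}")     # 7-hour day: $6,750,000
print(f"${annual_cost(30, 8.75):,.0f}")  # 8.75-hour day: $5,400,000
```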

    And it's not just about the lost time of individuals. Almost always, other people in the system have to play a role that they did not play before (this is about management information systems, not just the digital tools), and there are often many iterations of double-checking and returned forms,  because people tend to be very poor cogs indeed.  For instance, the average time it takes for me to get recompense for expenses is now over 6 months, up from 2-4 weeks before. The time it takes to simply enter a claim alone is up from a few minutes to a few hours, often spread over months, and several other people's time is also taken up by this process. Likewise, leave reporting is up from 2 minutes to at least 20 minutes, usually more, involving a combination of manual emails, tortuous per-hour entry, the ability to ask for and report leave on public holidays and weekends, and a host of other evils. As a supervisor, it is another world of pain: I have lost many hours to this, compounding the 'mistakes' of others with my own (when teaching computing, one of the things I often emphasize is that there is no such thing as user error: while they can make mistakes and do weird stuff we never envisaged, it is our failure to design things right that is the problem). This is not to mention the hours spent learning the new systems, or the effects on productivity, not just in time and motivation, but in preventing us from doing what we are supposed to do at all. I am doing less research, not just because my time is taken with soul-destroying cog-work, but because it is seldom worth the hassle of claiming, or trying to manage projects using badly designed tools that fit better - though not well - in a factory. Worse, it becomes part of the culture, infecting other processes like ethics reviews, student-tutor interactions, and research & development. 

    In an age when most of the world has shaken off the appalling, inhuman, and empirically wrong ideas of Taylorism, we are becoming more and more Taylorist. As McLuhan said, we shape our tools and our tools shape us.

    To add injury to insult, these awful things actually cost money to buy and to run – often a lot more money than they were planned to cost, yielding far smaller savings than promised, or even losses, even in the IT Services department where they are justified on the grounds of cutting costs. For instance, O365 cost nearly three times the initial estimates on which decisions were based, and it appears that it has not reduced the workload for those having to support it, nor the network traffic going in and out of the university (in fact it may be much worse), all the while costing us far more per year to access than the reliable and fully-featured elderly open source product it replaced. It also breaks a lot more. It is hard to see what we have gained here, though it is easy to see many losses.

    Technological debt

    The one justification for this suicidal stupidity is that our technological debt – the time taken to maintain, extend, and manage old systems – is unsustainable. So, the argument goes, if we just buy baseline tools without customization, especially if we outsource the entire management role to someone else, we save money because we don't have to do that any more.

    This is - with more than due respect - utter bullshit.

    Yes, there is a huge investment involved over years whenever we build tools to do our jobs and, yes, if we do not put enough resources into maintaining them then we will crawl to a halt because we are doing nothing but maintenance. Yes, combinatorial complexity and path dependencies mean that the maintenance burden will always rise over time, at a greater-than-linear rate. The more you create, the more you have to maintain, and the connections between what we create add to the complexity. That's the price of having tools that work. That's how systems work. Get over it. That's how all technology evolves, including bureaucratic systems. Increasing complexity is inevitable and relentless in all technological systems, notwithstanding the occasional paradigm shift that kind-of starts the ball rolling again. Anyone who has stuck around an organization long enough to see the long-term effects of their interventions would know this.

    These new baseline systems are no different, save in one way: rather than putting the work into making the machines work for us, we instead have to evolve, maintain, and manage processes in which we do the work of machines. The complexity therefore impacts every single human being who has to enact the machine, not just developers. This is crazy. Exactly the same work has to be done, with exactly the same degree of precision as that of the machines (actually more, because we have to add procedures to deal with the errors that software is less likely to make). It's just that now it is done by slow, unreliable, fallible, amotivated human beings. For creative or problem-solving work, taking tasks away from machines and giving them to humans would be a good thing. For mechanistic, process-driven work, where human error means the process breaks, it is either great madness, great stupidity, or great evil. There are no other options. At a time when our very survival is under threat, I cannot adequately express my deep horror that this is happening.

    I suspect that the problem is in large part due to short-sighted local thinking, a commonplace failure in hierarchical systems that gets worse the deeper and more divisive the hierarchies become. We see only our own problems, without understanding or caring about where we sit in the broader system. Our IT directors believe that their job is to save money in ITS (the department dealing with IT), rather than to save money for the university. Not only are they outsourcing our complex IT functions to cloud-based companies (a terrible idea, for the aforementioned reasons), they are outsourcing the work of information technologies to the rest of the university. The hierarchies mean a) that directors seldom get to see or hear of the trouble it causes, b) that they mix mainly with others at or near their hierarchical level who do not see it either, and c) that they tend to see problems in caricature, not as detailed pictures of actual practices. As the hierarchies deepen and separate, those within a branch communicate less with others in parallel branches, or with those more than a layer above or below. Messages between layers are, by design, distorted and filtered. The more layers, the greater the distortion. People take further actions based on local knowledge, and their actions affect the whole tree. Hierarchies are particularly awful when coupled with creative work of the sort we do at Athabasca, or with fields where change is frequent and necessary. They used to work OK for factories that did not vary their output much and where everything was measurable, though in modern factories that is rarely true any more. For a university, especially one that is online and thus lacks many of the short circuits found in physical institutions, deepening hierarchies are a recipe for disaster. I suppose it goes without saying that Athabasca University has, over the past few years, seen a huge deepening of those hierarchies.

    True costs

    Our university is in serious financial trouble that it would not be in were it not for these systems. Even if we had kept what we had, without upgrading, we would already be many millions of dollars better off, countless thousands of hours would not have been wasted, we would be far more motivated, we would be far more creative, and we would still have some brilliant people whom we have lost as a direct result of this process. All of this would be of great benefit to our students, and we would be moving forwards, not backwards. We have lost vital capacity to innovate, and lost vital time to care about what we are supposed to be doing rather than working out how the machine works. The concept of a university as a machine is not a great one, though there are many technological elements and processes needed to make it run. I prefer to think of it as an ecosystem or an organism. As an online university, our ecosystem/body is composed of people and machines (tools, processes, methods, structures, rules, etc.). The machinery is just there to support and sustain the people, so that they can operate as a learning community and perform their roles in educating, researching and community engagement. The more we have to be the machines, the less efficiently the machinery will run, and the less human we can all be. It's brutal, ugly, and self-destructive.

    When will we learn that the biggest costs of IT are to its end users, not to IT Services? The tools that have now been replaced were customized and created for extremely good reasons: to make our university and its systems run better, faster, more efficiently, and more effectively. Our ever-growing number of new off-the-shelf and outsourced systems, which take more of our time and more of our intellectual and emotional effort, have wasted and continue to waste countless millions of dollars, not to mention the huge costs in lost motivation, ill will, and lost creativity and caring. In the process we have lost control of our tools, lost the expertise to run them, and lost the capability to innovate in the one field in which we, as an online institution, must and should have the most expertise. This is killing us. Technological debt is not voided by replacing custom parts with generic pieces. It is transferred, at a usurious rate of interest, to those who must replace the lost functionality with human labour.

    It won't be easy to reverse this suicidal course, and I would not enjoy being the one tasked with doing so. Those who were involved in implementing these changes might find it hard to believe, because it has taken years and a great deal of pain to get here (and it is far from over yet - the madness continues), but breaking the system was hundreds of times easier than fixing it will be. The first problem is that the proprietary junk that has been foisted upon us, especially when hosted in the cloud, is a one-way valve for our data, so it will be fiendishly hard to get it back again. Some of it will be in formats that cannot be recovered without data loss. New ways of working that rely on the new tools will have insinuated themselves, and will have to be reversed. There will be plentiful down-time, with all the associated costs. But it's not just about data. From a systems perspective, this is a Humpty Dumpty problem. When you break a complex system, whether a body or an ecosystem, it is almost impossible to restore it to the way it was. There are countless system dependencies and path dependencies, which mean that you cannot simply start replacing pieces and assume that it will all work. The order matters. Lost knowledge cannot be regained - we will need new knowledge. If we do manage to survive this vandalism to our environment, we will have to build afresh, to create a new system, not restore the old. This is going to cost a lot. Which is, of course, exactly what Microsoft and all the other proprietary vendors of our broken tools count on. They carefully balance the cost of leaving them against what they charge. That's how it works. But we must break free of them, because this is deeply, profoundly, and inevitably unsustainable.
