On the need for funding in UK Chemistry Higher Education

In 2014/2015, 18,495 people opted to study undergraduate chemistry in higher education in the UK. What do we know about their experience of learning chemistry?

A search of Web of Science was conducted for those based in the UK publishing about chemistry education in the period 2014–2016. This returned 88 hits. An initial screening reduced this number to 71. Those removed included returns such as a “Wales” hit that was actually New South Wales, book chapters that weren’t about chemistry teaching in classrooms, and authors based in the UK but writing about non-UK classrooms.

71

Of these 71, the results were categorised as follows. 7 papers referred to chemical engineering, or something specific about chemistry for engineers. While these may be of value to chemists, they are not about teaching chemistry to chemists. Similarly, 1 paper specifically discussed some detail of chemistry relevant to a pharmacy syllabus.

63

This left 63. A further 10 articles were about school chemistry, 6 were editorials, and 2 were reviews about some aspect of chemistry which, while written in the UK, obviously extended their reach into international curricula. A further 3 were about informal chemistry: public engagement, outreach, or the presentation of chemistry in popular books.

42

So now we are left with 42 articles about chemistry education in higher education written by someone based in the UK. The majority of these – 22 – were about something to do with laboratory work: typically a new laboratory experiment, but occasionally an approach to doing something in the lab (e.g. alternative lab reports). A further 16 were categorised as innovative ideas: unusual or novel approaches reported because of their novelty. Evaluation of whether such innovations are effective tends in this category to be based on student evaluations or questionnaires, and typically lacks a formal research basis.

4

This leaves 4 articles published in 2014, 2015, or 2016 that described something about the curriculum or the learners in higher education chemistry. 2 of these articles were written by Overton and Randles at Hull, both of whom are now abroad. The remaining 2 were on how well maths prepares students for studying chemistry (part of a large project looking at the relevance of maths for a variety of subjects), and on the design and evaluation of a polymer course. That’s it.
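For those who like to see the screening arithmetic laid out in one place, here it is as a trivial Python sketch (all counts are taken from the paragraphs above):

```python
# The screening arithmetic from the post, in one place.
hits = 88
false_hits = 17                  # NSW 'Wales', book chapters, non-UK classrooms
eng_pharm = 7 + 1                # chemistry for engineers; pharmacy-specific
other_scope = 10 + 6 + 2 + 3     # school chemistry, editorials, reviews, informal
labs_innov = 22 + 16             # laboratory papers and innovative ideas

remaining = hits - false_hits    # 71
remaining -= eng_pharm           # 63
remaining -= other_scope         # 42
remaining -= labs_innov          # 4
print(remaining)                 # 4 papers on curriculum or learners in UK HE
```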

Chem Ed Publications in UK

We know nothin’

I argue then that we have no idea how students experience chemistry in higher education in the UK. We have no idea how well school chemistry prepares them, what their difficulties are, how they study, or what particular aspects of studying the UK chemistry degree are challenging. We’ve no idea how students experience lectures, what they learn in laboratories, nor how tutorials run. We don’t know how students balance workload, what effect part-time work has, nor whether students are able to discuss the relevance of chemistry to everyday life. We get occasional glimpses of issues around students’ employability, thanks to work such as that done at Nottingham (CERP, 2017, outside the time boundary of this search), but we have never revisited the glory of Hanson and Overton, 2010. We will of course have copious amounts of data on students’ entry performance. We can guess that we will have normal distributions in grade data in our annual assessments, and that external examiners will generally be satisfied. Entry and output detail are all we can cling on to.

Around 19,000 chemistry students will be starting in September. We desperately need some funding to begin to find out in a serious way what experience awaits them.


200th Blog Post

This is my 200th blog post. Now I should say that, while I am impressed with that number, given that it is over seven years since Róisín Donnelly and Muireann O’Keeffe gently broached the idea of starting a blog, it is not a fantastic output rate: to borrow Kevin Bridges’ joke about losing 4 stone over 10 years, I don’t think I’ll be writing a book on how to blog.

I’m going to avoid the kind of post where I reflect on my blogging, think about what I’ve learned, and look with renewed wistful enthusiasm to the future: fail better! I’m also going to avoid my usual call to encourage others to blog, as my conversion rate is low. (Briefly: yes you have something to say, yes you can write, yes it is worth the time, and you’ll only be threatened with litigation once).

So instead I will celebrate by highlighting a few blogs I like to go to for my chemistry/education fix. I should say this isn’t meant to be encyclopaedic, but rather a list of the blogs that, when I see a new post, I will make time to read. I have previously done more formal summaries for Nature Chemistry and Education in Chemistry. (I see you have to pay $18 to read the Nature Chemistry one! Holy Mother of Jerusalem how did we get to this state?)

Read these blogs:

Katherine Haxton: Possibilities Endless – I love Katherine’s blog; it’s always a dose of pragmatism and reality, mixed in with something useful. She’s honest and funny and it is all very refreshing. This is also one of the very few blogs left in the world where people seem to leave comments.

Blogs about chemistry teaching and evidence based practice

  • David Paterson: Thoughts on chemistry and education – I think this is a newish blog. Given its title, its appeal is apparent. And it doesn’t disappoint. It’s clear that the author is someone who thinks very seriously about teaching, and his blogs are wonderful summaries and conversations about those thoughts.
  • Kristy Turner: Adventures in chemistry education on both sides of the transition between school and HE – just like the title, Kristy is someone with a lot to say and this blog is one of the outlets she uses. There is always some good insight, highlighting things you mightn’t have thought of.
  • Niki Kaiser: NDHS Blogspot – This is an amazing website and growing resource. I can’t actually keep up with it, but Niki and others post stuff on lots of aspects of teaching chemistry, cognitive science being the strand I follow. Very useful.

Blogs about chemistry education research and ongoing projects

This category is sadly not well occupied. How wonderful it would be to get updates and insights into the work people are doing. We tried with our badging lab skills site, and it got lots of interest, so despite promising not to, I do really encourage people to do it. Two nascent blogs in this category offering real hope are:

  • Stephen George-Williams: Investigating the effects of Transforming Laboratory Learning – Stephen is posting updates about his PhD project, which is centred on lab education. I think this is a great idea, and it will be interesting to follow to see the kind of data gathered and what is done with it.
  • Nimesh Mistry: Mistry Research Group – Nimesh is also blogging about his observations as part of research in his group. His latest one documents questions students ask in the lab, and he plans to think about how he will use those observations in planning lab design. More please!

I’m sure I have forgotten some and hope that if I have, I am reminded, so that I can apologise most profusely.

 

 


Mayer’s Principles: Using multimedia for e-learning (updated 2017)

Anyone involved in e-learning will know of the cognitive theory of multimedia learning, which draws together the information-processing model (dual coding), cognitive load theory (working memory), and the notion of active processing. You can read a little more about this in this (old) post.

Anyway, for most of us who don’t do full-on e-learning, Mayer’s principles have value when we make things like videos or multimedia that we wish students to interact with outside of their time with us. As such, Mayer’s principles, as reported in The Cambridge Handbook of Multimedia Learning, are well cited. Mayer has just published an update (HT to the wonderful new Twitter feed: https://twitter.com/CogSciLearning), and because I have nothing better to do than twiddle my thumbs for the summer (thank you Adonis), I made a graphic summarising the 12 principles he describes. Many seem obvious, but that is probably no bad thing; as well as thinking about videos, there might be some lessons about PowerPointing here too. Click on the image to embiggen.

Mayer’s Principles: Using multimedia for e-learning (from Mayer, R. E. (2017) Using multimedia for e-learning, Journal of Computer Assisted Learning, doi: 10.1111/jcal.12197)


Lessons from a decade of ‘doing’ chemistry education

It’s been 10 years since “Developing practical chemistry skills by means of student-driven problem based learning mini-projects” was published in Chemistry Education Research and Practice, and it marked the kick-starting of an accidental career invested in chemistry education. This paper was published with two colleagues and friends, Claire Mc Donnell and Christine O’Connor, who inducted me into the ways of all things chem-ed. We would continue to work together; Claire and I guest-edited a special issue of CERP on technology in chemistry education in 2013, writing an editorial that is surprisingly cited quite often (for an editorial – I think it is because we say… something). And Christine and I wrote a book chapter for the 2015 Wiley book on Chemistry Education, which, in full respect to the editors, has quite a list of names assembled as authors. But wait: this is not an article about how great Seery is (that’s the next one, and the one before).


Can I say anything sensible about making this a profession? I’m often asked by people who are interested in teaching and learning and education-focussed careers what kinds of things they should think about to achieve this. I don’t know if I am qualified to answer that, as much of what has happened was unplanned, fostered by a benevolent, and often indifferent, department, working in a REFless, TEFless culture. Moving to Edinburgh has meant that this is now my ‘proper job’ rather than a hobby, but I wouldn’t call the transition or the journey a career path. So, much of what follows is what I would advise considering the REFful, TEFful culture we now live in. With that caveat, here are the top tips…

1. Separate the inner scholar from the teacher

One of the biggest difficulties for someone invested in teaching and learning chemistry is that all aspects of teaching and learning chemistry are probably of some interest. I remember going to conferences and wanting to see everything and #ohmygodthatssocoolwemusttrythat and being overwhelmed very quickly, because of course you don’t have time to try everything. You will not have time to think about everything at once, so my headline piece of advice is to identify what it is you will focus on; what will become your niche. You as a teacher will need to think about labs and lectures and tutorials and online marking and placement and professional development and…

But what will you as a scholar focus on? What is going to be the topic you will be able to have an intellectual basis in? Name it and begin to be strict with yourself about focussing on it. I see a lot of people who don’t produce any outputs even though they are doing good work because they are trying to do too many things.

Imagine an organic chemist. They of course know about most aspects of organic chemistry generally – they could teach any 2nd year course – but in their research they specialise in one or two particular aspects.

2. Read

If you are going to be scholarly about something, then you must read. I find it very surprising how little people read, or worse, how people cherry-pick some literature. Reading is important if we are going to move on from the “gut feeling” or “in my experience” stances that drag down our academic standing. It is impossible to read everything, but that doesn’t mean you don’t read anything, and it certainly doesn’t mean you rely on 140-character summaries as your academic insight. Twitter is amazing for pointing out unusual highlights and what other people you respect consider important, and I have discovered countless gems that way. But you must be more systematic. This involves identifying a series of journals that you think are of interest and keeping up to date with what is published. If getting into a new area, it involves surveying the literature (hopefully finding a review!) and finding out who the key players are. Reading also helps develop a kind of cultural capital – how do people go about things in this field; what are the acceptable norms? What the hell does being ethical mean?

How do you read? The challenge of reading 1000 papers might be a bit daunting. So of course you don’t need to read every line (except mine, for those: read every, single, line), but rather you are reading with a purpose. Perhaps you are making notes on how people implemented online quizzes in their courses. It doesn’t really matter if scores at the University of West Nowhere went from 45.6% to 52.1%; what matters in this initial survey is what the rationale and context were, how they went about it, how they measured, what limitations they stated, and who they cited as influences. You can very quickly build up a map of studies so that you have a basis for designing your own study on online quizzes; you can state what other people have done, give a rationale for your approach, and compare your results to others. Too often, this analysis is done post hoc. That is not a scholarly approach.

Reading also involves becoming familiar with learning theories. Again, just reading lots of learning theories is a passive way to approach this. Everybody is a constructivist because everybody is a constructivist. But what does that even mean? I thought you liked cognitivism too? How do you marry those thoughts?

3. Generate outputs

Many people do identify an area and do read, but never “get around” to publishing. This is tragic, because it means everyone has benefited from their scholarship except them. Developing outputs – at the very least conference presentations – is the only way the world (and also promotion and interview panels) knows something exists. Everyone is a great teacher, everyone can quote some line of good feedback; don’t get that confused with the work of a scholar – producing some output to share with the world that is the result of academic work. The obvious output is a journal publication, but what if you made artefacts as part of some study – can you publish them online, maybe even with a link on the department website? Especially for those new to the field, this will be a useful indicator to show that you have demonstrated interest and will be a useful talking point at interviews.

One of the difficulties people find with writing outputs is that they don’t know how to write. They had a great idea, they got some nice results, and now they’ve got to make it look academic, which involves finding some references that look appropriate (See reading, above). Of course this is not the way to go about things. Our organic chemist does not just go into the lab one day and mix some things, happen across an interesting result, and then think about finding some sensible rationale as to why those chemicals were mixed. And it is unlikely that our educator was similarly flippant – there is likely some rationale in there but it makes life so much easier if the reading was done in advance to give that rationale some basis.

In my own experience, I cannot overstate the value that keeping a blog has had in developing writing (yes I know this one is a long waffle). When you write something, you learn to think about how to present arguments, write a narrative, and, in the case of academic blogging, you have to have read something before writing about it. They say you don’t understand something until you teach it; trust me: you don’t understand something until you blog about it and expose your thoughts to the world. The world, in return, is usually grateful for you sharing those thoughts. And it all ends up being an accidental output.

4. Make friends

It’s nice to discuss your work and have support. One of the best things I had was the support of my original two co-authors. Claire and I went on to formalise this in a study we subsequently did and developed a critical friendship – one grounded in the knowledge that we both wanted what was best for each other, but were not afraid to call the other out when something was awry about some aspect of the work or a conclusion. It was fantastic, not only for the actual conversations, but for the imagined ones too; I would wonder what Claire would think about something even before I would talk to her. This isn’t easy to find but is worth seeking out. At the very least, connect with others at conferences – yes, we would all rather stand facing a corner and check our phone occasionally, but come on now everyone, turn around. Talk. Introduce yourself. I was taken aback recently when someone I had always been afraid to talk to came up and introduced themselves. We had a great chat and ended up with a group hug (it’s the way I roll). The point is, people go to these things because we have a common interest. Very often, there is someone else in the department who is interested in teaching. Talk with them (not at them).

One key aspect of making friends is the ability to listen. One thing that is slightly grating (to me) is that people will listen to you just long enough to find a way to link something you are saying to something that they have done that is much better. This is not a way to learn about people and how they do things. Use your blog to show off. A better approach might be to think about how the thing the person is talking about might marry with your own experiences into a useful collaboration. You’re a constructivist, aren’t you?

In terms of making friends in the UK, the annual VICE conference is a good place to start. Registration deadline for this year is imminent (www.vicephec2017.com).

 

 


Student study approaches

Many thanks to Scott Lewis/USF who put an interesting paper on students’ approaches to study in introductory chemistry my way. The paper describes the development of a framework for learning approaches in chemistry, and the authors come up with four levels: (1) gathering facts; (2) learning procedures; (3) confirming understanding; and (4) applying ideas.

Do students know how to study? In an exam dominated system, one might reasonably expect students to focus on the second approach – if they learn the procedures and have the facts to hand, then they will be able to use these in an exam. Of course, as teachers we hope students will aspire to developing (and confirming) understanding, and begin to see how this understanding is useful in a wider sense.

So my initial reaction was to make a poster on these learning approaches. It seems that levels 2 and 3 have similar outcomes – students are able to use procedures in assessments – but the motivations behind each are very different. My first attempt at a poster is below, but having spent an afternoon making and refining it, I am wondering if it now immediately needs a friend in the form of a poster detailing specific strategies. The intention is to (a) make it clear to students that different approaches exist, and then (b) give them some strategies (beyond rewriting notes) that they can put into place. (Note I left level 4 off this poster for reasons I think I can defend.)

More thought needed…

Study Approaches


Reflections on #MICER17

Two related themes emerged for me from the Methods in Chemistry Education Research meeting last week: confidence and iteration.

Let’s start where we finished: Georgios Tsaparlis’ presentation gave an overview of his career studying problem solving. This work emerged out of Johnstone’s remarkable findings around working memory and mental demand (M-demand).1,2 Johnstone devised a simple formula: if the requirements of a task were within the capability of working memory, students would be able to process the task; if not, they would find it difficult. This proposal was borne out by plots of performance against complexity (demand), which showed a substantial drop at the point where M-demand exceeded working memory, and these findings seeded a remarkable amount of subsequent research.
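As a toy illustration of that step shape (my own sketch with invented numbers, not Johnstone’s data or model):

```python
# Toy version of Johnstone's prediction: performance holds up while the
# M-demand of a question is within working-memory capacity, then drops.
CAPACITY = 7  # working-memory capacity in 'bits'; illustrative value

def predicted_success(m_demand: int) -> float:
    """Fraction of students expected to answer correctly (made-up numbers)."""
    return 0.85 if m_demand <= CAPACITY else 0.30

for demand in range(4, 11):
    bar = "#" * int(predicted_success(demand) * 20)
    print(f"M-demand {demand:2d}: {bar}")
# The printed 'plot' shows the substantial drop once demand exceeds capacity.
```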

However, things are seldom as simple as they seem and Tsaparlis’ work involved studying this relationship in different areas of chemistry – looking, for example, at how students solve problems in organic chemistry compared to physical chemistry, and the effect of the type of question. Each study was an iteration, an investigation of another aspect of this multi-dimensional jigsaw, aiming to make a little bit more sense each time. Sometimes the results led to an ill-fitting piece, with data being consigned to a desk drawer for a few years until further study allowed it to be explored in a new light. Towards the end of this arc of work, he began to move away from linear modelling, where we look at the strength of individual aspects on an overall hypothesis, to more complex models such as the “random walk”. It is another iteration.

The point to make here is that there was no single study that said: this is how students solve equilibrium questions. Rather, each study added a little more to the understanding of a particular model. Indeed Keith Taber outlined in his Ethics workshop the importance of context and situation in publishing results. Things are rarely definitive and usually context dependent.

For me this is reassuring. Just like Johnstone’s “ON-OFF” findings for working memory, there is a fear that one is either able to do education research or one isn’t; responding to Suzanne Fergus’ pre-meeting prompts, which guided her talk on writing research questions, a few participants indicated that “confidence” was one of the barriers to getting involved in education research. I remember talking to an eminent chemistry professor who said something along the lines of “never look back!” – to just publish what you know to be your best understanding (and instrumentation) at a given time, accepting that more studies and analysis might lead to more understanding.

While this probably wasn’t intended to be as carefree as I leverage it here, there will always be one more publication, one better approach, one alternative understanding. The task then is to continually inform and develop our understanding of what it is we wish to understand. The action research cycles outlined more formally by Orla Kelly in her presentation facilitate this, although of course one might complete several cycles before considering publication. But I think iterations happen naturally as any research study progresses. Graham Scott illustrated this nicely in his presentation; later publications adding further depth to earlier ones. Stewart Kirton discussed building this iteration onto the design of research instruments.

Our task as education researchers then is to ensure that we are publishing to the best of our understanding and publishing with good intent – that we believe what we are saying at a particular time is an honest overview of our understanding of our study at that time in a given context.

Our task as practitioners is to move on from the duality of things that “work” and things that “don’t work”. The education literature isn’t unique in that it tends to publish more positive results than not, so when looking for the “best” way to do something, a keen reader may soon become overwhelmed, and even frustrated with the education literature for its lack of clarity. A task for those attending MICER then is not necessarily translating research into practice (a common call), but rather communicating a greater awareness of the process of education research, along with how to meaningfully interpret outcomes so that they may be used – or not – in our teaching of chemistry.

 

We are grateful to Chemistry Education Research and Practice for their support of this event, along with the RSC interest groups: Tertiary Education Group and Chemistry Education Research Group.

Methods in Chemistry Education Research 2017 Welcome Slide

[1] J. Chem. Educ., 1984, 61, 847.

[2] Education in Chemistry, 1986, 23, 80-84.


Wikipedia and writing

Academics have a complicated relationship with Wikipedia. There’s a somewhat reluctant acknowledgement that Wikipedia is an enormously used resource, but as the graphical abstract accompanying this recent J Chem Ed article1 shows, WE ARE NOT TOO HAPPY ABOUT IT. Others have embraced the fact that Wikipedia is a well-used resource, and used this to frame writing assignments as part of chemistry coursework.2-4  There is also some very elegant work on teasing out understanding of students’ perceptions of Wikipedia for organic chemistry coursework.5

Graphical abstract of M. D. Mandler, Journal of Chemical Education, 2017, 94, 271-272.

Inspired by a meeting with our University’s Wikimedian in Residence, I decided to try my hand at creating a Wikipedia article. The article was about a little-known chemist who hadn’t been written about before, and who I’d say is unknown generally. I found her name listed on the Women in Red page, which is outside the scope of this post, save to say: go look at that page.

Writing the article was interesting, and some implications from a teaching perspective are listed:

  1. If there isn’t a Wikipedia article, writing a summary overview is quite a lot of work.

One of the great things about Wikipedia is of course that it offers a nice summary of the thing you are interested in, which then prompts you to go and look up other stuff which you can then pretend to have found originally. But what if there isn’t a Wikipedia article? Where do you start? Of course Googling and getting some information is part of this, but there is a step before, or at least coincident with this, which involves scoping out the content of what you want to summarise. This will involve reading enough so that you can begin this overview plan, and then searching to find information about the plan. In chemistry, the order of searching will likely go Google > Google Scholar > Databases like Web of Science etc > Google Books… Because of my context, I also got stuck into the RSC’s Historical Collection (a terribly under-promoted amazing resource). In any case, there is some good work to do here on developing information literacy (which in a formal education setting would probably need to be structured).

  2. #citethescheise

I was encouraged in writing to cite my work well, linking to original and verifiable sources. I am long enough in the game to know this, and may be known to advise novice academic writers to “referencify” their work for journals; the academic genre is one where we expect lots of superscript numbers to make a text look well informed. Wikipedia has a very particular style where essentially every fact needs a citation. This is something I did reasonably well, but I was very pleasantly surprised to see that someone else looked quite closely at these (new articles are reviewed by people who make amendments/changes). I know this because in my case I cited a modern J Mat Chem paper which offered an example of where the original contribution of my chemist had been cited about a century later, in 2016 (notability is a requirement in Wikipedia so I had this in mind). This reference had been checked, with the relevant line from it added to the citation. It was reassuring to know that someone took the time to consider the references in this amount of detail.

From a teaching point of view, we try in lab reports and theses to encourage students to verify claims or opinions with data or literature. This seems like very good training for that. The point was also made to me that it teaches students to explore the veracity of what they read on Wikipedia, by considering the sources quoted.

  3. Learning to write

Wikipedia is an encyclopaedia (duh) and as such it has a particular style. I actually found it very difficult to write initially and went through quite a few drafts in Word, with a view to keeping my piece pretty clinical and free of personal opinion. Asking students to write Wikipedia articles will undoubtedly improve their writing in that style; I’m not too sure yet how beneficial that is; I feel the greater benefits are in information searching and citing, and in scoping out a narrative. But that is probably a personal bias. Edit: fair point made in this tweet: https://twitter.com/lirazelf/status/865124724166320128

  4. Writing to learn

Whatever about developing writing skills, I certainly learned a lot about my subject as well as much more context about the particular topic. Quite a lot of what I read didn’t make it into the final article (as it might have, for example if I were writing an essay). But as we know from preparing lecture notes, preparing a succinct summary of something means that you have to know a lot more than the summary you are presenting.

Why Wikipedia?

In challenging the arguments about Wikipedia such as those indicated in the graphical abstract above, I do like the idea of students getting to know and understand how the site works by interacting with it. Wikipedia usage is here to stay and I do think there is a strong argument around using it in academic writing and information literacy assignments. One very nice outcome is that something real and tangibly useful is being created, and there is a sense of contributing. Writing for something that is going to go live to the world means that it isn’t “just another exercise”. And Wikipedia articles always come to the top of Google searches (mine was there less than an hour after publishing).

Search view at 16:12 (left) after publishing, and at 16:43 (right).

I’m interested now in looking at Wikipedia writing, certainly in informal learning scenarios. A particular interest is going to be exploring how it develops information literacy skills and how we structure this with students.

My page, I’m sure you are dying to know is: https://en.wikipedia.org/wiki/Mildred_May_Gostling.

(Lots of useful points about Wikipedia here – see Did You Know.)

With thanks to Ewan McAndrew and Anne-Marie Scott.

Referencifying 

  1. M. D. Mandler, Journal of Chemical Education, 2017, 94, 271-272.
  2. C. L. Moy, J. R. Locke, B. P. Coppola and A. J. McNeil, Journal of Chemical Education, 2010, 87, 1159-1162.
  3. E. Martineau and L. Boisvert, Journal of Chemical Education, 2011, 88, 769-771.
  4. M. A. Walker and Y. Li, Journal of Chemical Education, 2016, 93, 509-515.
  5. G. V. Shultz and Y. Li, Journal of Chemical Education, 2016, 93, 413-422.

 


On the sublime

“Sublimity,” Hauptmann says, panting, “you know what that is, Pfennig?” He is tipsy, animated, almost prattling. Never has Werner seen him like this. “It’s the instant when one thing is about to become something else. Day to night, caterpillar to butterfly. Fawn to Doe. Experiment to result. Boy to man.”

All the Light We Cannot See, Anthony Doerr

Moonlight Scene, James Arthur O’Connor


Links to Back Issues of University Chemistry Education

I don’t know if I am missing something, but I have found it hard to locate past issues of University Chemistry Education, the predecessor to CERP.  They are not linked on the RSC journal page. CERP arose out of a merger between U Chem Ed and CERAPIE, and it is the CERAPIE articles that are hosted in the CERP back issues. Confused? Yes. (More on all of this here)

Anyway, in searching and hunting for old U Chem Ed articles, I have cracked the code of links and compiled links to back issues below. They are full of goodness. (The very last article published in UCE was the very first chemistry education paper I read – David McGarvey’s “Experimenting with Undergraduate Practicals”.)

Links to Back Issues

Contents of all issues: http://www.rsc.org/images/date_index_tcm18-7050.pdf 

1997 – Volume 1:

1 – remains elusive… It contains Johnstone’s “And some fell on good ground” so I know it is out there… Edit: cracked it – they are available by article:

1998 – Volume 2:

1 – http://www.rsc.org/images/Vol_2_No1_tcm18-7034.pdf

2 – http://www.rsc.org/images/Vol_2_No2_tcm18-7035.pdf

1999 – Volume 3:

1 – http://www.rsc.org/images/Vol_3_No1_tcm18-7036.pdf

2 – http://www.rsc.org/images/Vol_3_No2_tcm18-7037.pdf

2000 – Volume 4:

1 – http://www.rsc.org/images/Vol_4_No1_tcm18-7038.pdf

2 – http://www.rsc.org/images/Vol_4_No2_tcm18-7039.pdf

2001 – Volume 5:

1 – http://www.rsc.org/images/Vol_5_No1_tcm18-7040.pdf

2 – http://www.rsc.org/images/Vol_5_No2_tcm18-7041.pdf

2002 – Volume 6:

1 – http://www.rsc.org/images/Vol_6_No1_tcm18-7042.pdf

2 – http://www.rsc.org/images/Vol_6_No2_tcm18-7043.pdf

2003 – Volume 7:

1 – http://www.rsc.org/images/Vol_7_No1_tcm18-7044.pdf

2 – http://www.rsc.org/images/Vol_7_No2_tcm18-7045.pdf

2004 – Volume 8:

1 – http://www.rsc.org/images/Vol_8_No1_tcm18-7046.pdf

2 – http://www.rsc.org/images/Vol_8_No2_tcm18-7047.pdf

I’ve downloaded these all now in case of future URL changes. Yes I was a librarian in another life.
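For anyone wanting to do the same, a minimal Python sketch that mirrors the issues (it assumes the URLs above stay live, and relies on the tcm IDs running sequentially from 7034 to 7047, as they do in the list; the requests package is needed):

```python
# Download the U Chem Ed back issues (1998-2004) listed above.
import requests

tcm = 7034  # the tcm IDs in the list above run sequentially from here
for volume in range(2, 9):          # volumes 2 (1998) to 8 (2004)
    for number in (1, 2):           # two issues per volume
        url = f"http://www.rsc.org/images/Vol_{volume}_No{number}_tcm18-{tcm}.pdf"
        pdf = requests.get(url, timeout=30)
        pdf.raise_for_status()
        with open(f"UCE_Vol{volume}_No{number}.pdf", "wb") as out:
            out.write(pdf.content)
        print("saved", url)
        tcm += 1
```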


Dialogue in lectures

This is not a post on whether the lecture is A Good Thing or not. Lectures happen. PERIOD!

A paper by Anna Wood and colleagues at the Edinburgh PER group, along with a subsequent talk by Anna at Moray House has gotten me thinking a lot over the last year about dialogue and its place in all of our interactions with students. The literature on feedback is replete with discussion on dialogue, sensibly so. The feedback cycle could be considered (simplistically) as a conversation: the student says something to the teacher in their work; the teacher says something back to the student in their feedback. There’s some conversation going on. Feedback literature talks about how this conversation continues, but what about the bit before this conversation begins?

Not a monologue

The spark from Wood’s work for me was that lectures are not a monologue. She is considering active lectures in particular, but cites Bamford* who gives a lovely overview of the nature of conversation in lectures in general. Bamford presents the case that lectures are not a monologue, but are a conversation. Just as in the feedback example above, two people are conversing with each other, although not verbally. In a lecture, the lecturer might ask: “Is that OK?”. An individual student might respond inwardly “Yes, I am getting this” or “No, I haven’t a freaking clue what is going on and when is coffee”. A dialogue happened. Wood’s paper discusses these vicarious interactions – a delicious phrase describing the process of having both sides of the conversation; an internal dialogue of sorts. She describes how this dialogue continues in active lectures, but sadly there is only one Ross Galloway, so let’s think about how this conversation might continue in lectures given by us mere mortals. How can we help and inform these vicarious interactions?

Developing a conversation

A problem you will by now have identified is that the conversation – “Is that OK?” and its retort – isn’t much of a conversation. So how can we continue it? My intention is to consider conversation starters in lectures that foster a sense in each individual student that they are having a personal conversation with the lecturer at points during the lecture, and that incorporate guides for the student to continue this conversation after the lecture, up to the point that they submit their work, prompting the conversation we started with above.

In Wood’s talk, she mentioned specific examples. The lecturer would ask something like: “Is 7 J a reasonable answer?” A problem with “Is that OK?” is that it is too broad. It’s difficult to follow up the conversation specifically, as it likely ends with yes or no.

How about a lecturer asks: “Why is this smaller than…?” You’re a student, and you’re listening. Why is it smaller? Do you know? Yes? No? Is it because…? Regardless of your answer, you are waiting for the response. You think you know the answer, or you know you don’t.

If we are to take dialogue seriously, then the crucial bit is what happens next. Eric Mazur will rightly tell us that we should allow discussion with peers about this, but we are mortals, and want to get on with the lecture. So how about the conversation goes something like this:

“Why is this smaller than…?”

[pause]

You are a student, and you will have an answer: you know, you think you know, you don’t know, or you don’t know what’s going on. You will have some response.

The lecturer continues:

“For those of you who think…”

The lecturer responds with a couple of scenarios. The conversation continues beyond a couplet.

Did you think of one of these scenarios? If so, the lecturer is talking to you. Yes, I did think that, and now I have it confirmed that I am right. Or: yes, I did think that – why is that wrong?

The lecturer can continue:

“While it makes sense to think that, have a look in reference XYZ for a bit more detail”.

The lecturer thus concludes this part of the conversation. A dialogue has happened, and each student knows either that they have a good idea of what is going on, that they don’t but know where to follow up the issue, or that they haven’t a clue what is going on. Whichever the case, there is some outcome, and some action prompted. Indeed one could argue that this prompted action (refer to the reference) is a bridge between the lecture and the tutorial – I checked this reference but don’t understand – and so the conversation continues there.

This all seems very obvious, and maybe everyone else does this and didn’t tell me. My lectures tend to have lots of “Is that OK?” type questions, but I do like this idea of a much more purposeful design to structuring the conversation with a large class. I should say that this is entirely without research base beyond what I have read, but I think it would be very empowering for students to think that a lecturer is aiming to have a conversation with them.


*Bamford is cited in Wood’s paper and I got most of it on Google Books.


Revising functional groups with lightbulb feedback

I’m always a little envious when people tell me they were students of chemistry at Glasgow during Alex Johnstone’s time there. A recent read from the Education in Chemistry back-catalogue has turned me a shade greener. Let me tell you about something wonderful.

The concept of working memory is based on the notion that we can process a finite number of new bits in one instance, originally thought to be about 7, now about 4. What these ‘bits’ are depends on what we know. So a person who only knows a little chemistry will look at a complex organic molecule and see lots of carbons, hydrogens, etc. joined together. Remembering it (or even discussing its structure/reactivity) would be very difficult – there are too many bits. A more advanced learner may be able to identify functional groups, where a group is an assembly of atoms in a particular pattern; a ketone, for example, being an assembly of three carbons and an oxygen with particular bonding arrangements. This reduces the number of bits.

Functional groups are important for organic chemists as they determine the reactivity of the molecule, and a first challenge for novices is being able to identify them. To help students practise this, Johnstone developed an innovative approach (this was 1982): an electronic circuit board.

Functional Group Board: black dots represent points where students needed to wire from the name to an example of the functional group

The board was designed so that it was covered with a piece of paper listing all functional groups of interest on either side, and then an array of molecules in the middle, with functional groups circled. Students were asked to connect a lead from the functional group name to a matching functional group, and if they were correct, a lightbulb would flash.

A lightbulb would flash. Can you imagine the joy?!

Amide back-up card

If not, “back-up cards” were available so that students could review any that they connected incorrectly, and were then directed back to the board.

The board was made available to students in laboratory sessions, and they were just directed to play with it in groups to stimulate discussion (and so as “not to frighten them away with yet another test”). Thus students were able to test out their knowledge, and if incorrect they had resources to review and re-test. Needless to say the board was very popular with students, such that more complex sheets were developed for medical students.

Because this is 1982 and pre-… well, everything, Johnstone offers instructions for building the board, developed with the departmental electrician. Circuit instructions for a 50 × 60 cm board were given, along with details of mounting the various layouts of functional groups onto the pegboard for assembly. I want one!
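In lieu of the hardware, a toy software analogue of the board (the group names and example structures here are my own illustrations, not taken from the original sheets):

```python
# A software 'lightbulb board': match a functional group name to an example.
BOARD = {
    "ketone": "CH3-CO-CH3",
    "amide": "CH3-CO-NH2",
    "ester": "CH3-CO-O-CH3",
}

def connect(name: str, example: str) -> None:
    """Wire a functional group name to a circled example on the 'board'."""
    if BOARD.get(name) == example:
        print("*** LIGHTBULB ***")
    else:
        print(f"No light - take the {name} back-up card, then try again.")

connect("amide", "CH3-CO-NH2")    # *** LIGHTBULB ***
connect("ester", "CH3-CO-NH2")    # No light - back-up card
```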

 

Reference

A. H. Johnstone, K. M. Letton, J. C. Speakman, Recognising functional groups, Education in Chemistry, 1982, 19, 16-19. RSC members can view archives of Education in Chemistry via the Historical Collection.


What is the purpose of practical work?

I have been reading quite a lot about why we do practical work. Laboratory work is a core component of the chemistry (science) curriculum but its ubiquity means that we rarely stop to consider its purpose explicitly. This leads to many problems. An interesting quote summarises one:

One of the interesting things about laboratories is that there has never been definite consensus about those serious purposes. Perhaps that is why they have remained popular: they can be thought to support almost any aim of teaching.1

Even within institutions, where there might be some prescription of the purpose in broad terms, different faculty involved in the laboratory may have different emphases, and subsequently the message about what the purpose of practical work is differs depending on who is running the lab on a given day.2

This matters for various reasons. The first is that if there is confusion about the purpose of practical work, then everyone involved will place their attention onto the part that they think is most important. Academics will likely consider overarching goals, with students developing scientific skills and nature of science aspects.3 Demonstrators will think about teaching how to use instruments or complete techniques. Students will follow the money, and focus on the assessment, usually the lab report, which means their time is best utilised by getting the results as quickly as possible and getting out of the lab.4 Everybody’s priority is different because the purposes were never made clear. As in crystalline.

The second reason for thinking about purposes is that without an explicit consideration of what the purposes of practical work are, it is difficult to challenge those purposes and consider their value. How many lab manuals open with a line similar to: “The purpose of these practicals is to reaffirm theory taught in lectures…”? The notion that the purpose of practicals is somehow to supplement taught material in lectures has long come in for criticism, and has little basis in evidence. Laboratories are generally quite inefficient places to “teach” theory. Woolnough and Allsop argued vehemently for cutting the “Gordian Knot” between theory and practical, arguing that practical settings offered their own unique purpose that, rather than being subservient to theory work, complemented it.5 Kirschner picks this argument up, describing science education in terms of substantive structure and syntactical structure. The former deals with the knowledge base of science, the latter with the acts of how we do science.6 Anderson had earlier distinguished between “science” and “sciencing”.7

Discussion therefore needs to focus on what this syntactical structure is – what is “sciencing”? Here, the literature is vast, and often contradictory. To make a start, we look to Johnstone who, with his usual pragmatism, distinguished between aims of practical work (what we set out to do) and objectives of practical work (what the students achieve).8 With this in mind, we can begin to have some serious discussion about what we want practical work to achieve in our curricula.

Links to source

References

  1.  White, R. T., The link between the laboratory and learning. International Journal of Science Education 1996, 18 (7), 761-774.
  2. Boud, D.; Dunn, J.; Hegarty-Hazel, E., Teaching in laboratories. Society for Research into Higher Education & NFER-Nelson Guildford, Surrey, UK: 1986.
  3. Bretz, S. L.; Fay, M.; Bruck, L. B.; Towns, M. H., What faculty interviews reveal about meaningful learning in the undergraduate chemistry laboratory. Journal of Chemical Education 2013, 90 (3), 281-288.
  4. (a) DeKorver, B. K.; Towns, M. H., General Chemistry Students’ Goals for Chemistry Laboratory Coursework. Journal of Chemical Education 2015, 92 (12), 2031-2037; (b) DeKorver, B. K.; Towns, M. H., Upper-level undergraduate chemistry students’ goals for their laboratory coursework. Journal of Research in Science Teaching 2016, 53 (8), 1198-1215.
  5. Woolnough, B. E.; Allsop, T., Practical work in science. Cambridge University Press: 1985.
  6. Kirschner, P. A., Epistemology, practical work and academic skills in science education. Science & Education 1992, 1 (3), 273-299.
  7. Anderson, R. O., The experience of science: A new perspective for laboratory teaching. Teachers College Press, Columbia University: New York, 1976.
  8. Johnstone, A. H.; Al-Shuaili, A., Learning in the laboratory; some thoughts from the literature. University Chemistry Education 2001, 5 (2), 42-51.

 


Why do academics use technology in teaching?

This week is All Aboard week in Ireland, aimed at “Building Confidence in Digital Skills for Learning”. I am speaking today in the gorgeous city of Galway on this topic, and came across this paper in a recent BJET which gives some useful context. It summarises interviews with 33 Australian academics from various disciplines on the topic of why they used technology in assessment. While the particular lens is on assessment, I think there are some useful things to note for those espousing the incorporation of technology generally.

Four themes emerge from the interviews

The first is that there is a perceived cost-benefit analysis at play; the cost of establishing an assessment process (e.g. quizzes) was perceived to be offset by the benefit that it would offer, such as reducing workload in the long-run. However, some responses suggest that this economic bet didn’t pay off, and that lack of time meant that academics often took quick solutions or those they knew about, such as multiple choice quizzes.

The second theme is that technology was adopted because it is considered contemporary and innovative; this suggests a sense of inevitability about using tools simply because they are there. A (mildly upsetting) quote from an interview is given:

“It would have been nice if we could have brainstormed what we wanted students to achieve, rather than just saying ‘well, how can ICT be integrated within a subject?’”

The third theme was one around the intention to shape students’ behaviour – providing activities to guide them through learning. There was a sense that this was expected and welcomed by students.

Finally, at the point of implementation, significant support was required, which often wasn’t forthcoming, and because of this, and other factors, intentions had to be compromised.

The authors use these themes to make some points about the process of advocating for and supporting those integrating technology. I like their point about “formative development” – rolling things out over multiple iterations and thus lowering the stakes. Certainly my own experience (in hindsight!) reflects the benefit of this.

One other aspect of advocacy that isn’t mentioned, but I think could be, is providing a framework upon which you hang your approaches. Giving students quizzes “coz it helps them revise” probably isn’t a sufficient framework, and nor is “lecture capture coz we can”. I try to use the framework of cognitive load theory as a basis for a lot of what I do, so that I have some justification for when things are supported or not, depending on where I expect students to be in their progression. It’s a tricky balance, but I think such a framework at least prompts consideration of an overall approach rather than a piecemeal one.

There’s a lovely graphic from All Aboard showing lots of technologies, and as an awareness tool it is great. But there is probably a huge amount to be done in terms of digital literacy, regarding both the how, but also the why, of integrating technology into our teaching approaches.

Click link to go to All Aboard webpage

 


On learning “gains”

I have heard the term learning gain on and off but considered that it was just an awkward phrase to describe learning, a pleonasm that indicated some generally positive direction for those needing reassurance. And I am sure I have read about the concept of learning gain in the past but never took too much notice, likely too busy checking Twitter to focus on the subject at hand. But it was through Twitter this week that I was released from my gains-free bubble, and enough neurons aligned for me to grasp that learning gain is actually A Thing.

And what a horrendous Thing it is. HEFCE have a website dedicated to it and define it as “an attempt to measure the improvement in knowledge, skills, work-readiness and personal development made by students during their time spent in higher education”. If that isn’t clear, then how about this: a 125-page document published by RAND for HEFCE defines learning gain (LG) as

LG = B – A’

They put it like this too – an equation on a separate line so that the scientists can nod along but the humanities people can just skip to the next piece of text. Good scientists should note that the variables should be italicised. Neither route offers any solace. This seemingly placid text masks one of the most remarkably awful ideas – gosh darn it THE most remarkable fuckwittery – I have come across in a long time.

Devoid of being able to monitor very much about teaching and learning in higher education, some genius has dreamt up this clanger. Rather than wondering why it is difficult (impossible) to achieve this task, the approach instead is to quantify with a number how much individual students have learned each year. A student will be measured at the start of the year, and again at the end of the year, and the difference is… well it’s obvious from the formula – there’ll be gainz. Each student’s own gain will be individual, so these poor lab rats exposed to this drudge are going to have to sit in an interview in a few years’ time and explain to someone what this garbage is about. Not only that, but this insidious beast is going to creep into every aspect of students’ learning. Degree transcripts will list a learning gain of 3.4 in career awareness and 2.9 in self-sufficiency. You were on a student society? Add 0.5 to your learning gain. Is this who we are now? How many centuries have universities existed, and now, at our supposed pinnacle, we have been reduced to this tripe. Don’t trust experts, Michael Gove said. Lots of very clever people are involved in this, and while I hope their intentions are good, I don’t trust the motives behind it. Writing an introduction to a special issue of HERD on measurement in universities, the editors cited Blaise:

A measurement culture takes root as if it is education because we are no longer sufficiently engaged with what a good education is. (HT)

HEFCE have funded several universities to wheel out creaking wooden horses, under the guise of pilot studies. It’s a cruel move, throwing some breadcrumbs at an education sector starved of research money to do anything. It is a horrible irony that, in the name of ensuring value for money for taxpayers, money is being spent on this. Don’t those involved see the next act in this tragedy? Institutional comparisons of average learning gains, far removed from the individual it was supposed to be personalised to, are easily computed. And numbers, whether they have meaning or not, are very powerful.

So when your student is sitting with you at the end of the year, and you have to measure their career awareness improvement, and your institution is under pressure as its gains are not as good as their protein-fuelled competitor, most likely a private operator in the dystopian near future the Tories are setting up, you might think again about their ability to discuss career pathways. Worse still is the difficulty in measuring a subject gain, or the even greater idiocy of using a proxy to measure something that can’t be meaningfully expressed as a number in the first place.

Another problem with measuring anything half as complicated as this, as a century of predictive studies has found, is that it is very easy to measure the large changes and the little changes, but the mass of students in the middle is mostly a muddle. And with LG = B – A', there will be lots of muddle in the middle. What about the very bright student who actually knew most of the content at the start of the year because of prior schooling, but was shy and wanted to settle in for the first year of university? What about the one who enjoys just getting by and will focus in Year 3, but in the meantime has to be subject to self-efficacy tests to demonstrate improvement? What about the geniuses? What about first-in-family students? What about the notion that university is more than a number, and the very darn fact that these are not things that we should try to measure?
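To put a number on that muddle, here is a toy simulation (all values invented) of what measurement noise does to a start-of-year/end-of-year difference:

```python
# Toy simulation: two noisy measurements per student, gain = end - start.
import random

random.seed(1)
NOISE = 5.0  # invented measurement error (percentage points) per sitting

def measured_gain(true_start: float, true_gain: float) -> float:
    start = true_start + random.gauss(0, NOISE)
    end = true_start + true_gain + random.gauss(0, NOISE)
    return end - start

for true_gain in (2, 5, 20):
    gains = [measured_gain(50, true_gain) for _ in range(10_000)]
    apparent_regress = sum(g < 0 for g in gains) / len(gains)
    print(f"true gain {true_gain:2d}: {apparent_regress:.0%} appear to have negative 'gain'")
# Large true changes survive the noise of two sittings;
# the modest gains of the middle mostly drown in it.
```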

I want my students to graduate mature and happy and adult and functioning members of society. If they learn some thermodynamics along the way, then that’s great. I can measure one of these things. It will be a sorry state of affairs if we have to measure others, and our future graduates, burdened with this heap of gunge won’t thank us for it.


Rounding up the peer review and digital badge project

Marcy Towns’ lovely paper from 2015 described the use of digital badges in higher education chemistry, specifically for the assessment of laboratory skills. This work was important. The concept of badges had been around for a while; when I first came across them while doing an MSc in E-Learning back in 2010, laboratory work seemed an obvious place to use them. But while the technology promised a lot, there weren’t feasible systems in place to do it. And what exactly would you badge, anyway? And would undergraduate students really take a digital badge seriously?

Towns’ work was important for several reasons. On a systematic level, it demonstrated that issuing badges to a very large cohort of students in undergraduate laboratories was feasible. At Purdue, they used an in-house system called Passport to manage the badging process, from submission of videos to viewing online for assessment, and the subsequent issuing of the badges. But more importantly for me, Towns’ work reasserted the notion of laboratory skills and competencies as something worth assessing in their own right. Rather than skills being implicitly assessed via the quality of results or the yield of a reaction, the technique itself – videoed while the student demonstrates it – was directly assessed. This is not something you come across very often in the education literature (some notable exceptions are pointed out in our paper).

This work answered two of the three questions I had about badges – there are systems in place (although as I discovered, Blackboard just about manages to do this, with a lot of creaking).  And there is scope for badging laboratory skills – the concept of badging is built on demonstrable evidence, and videoing techniques is part of this. Whether students take badging seriously I think still needs to be answered. My own sense is that there will need to be a critical mass of badges – an obvious ecosystem where it is clear to students how they can progress, and our own work in this regard is extending into more advanced techniques.

Incorporating peer review

One of the great insights Towns and her students shared at a workshop at BCCE last summer was the notion of narration in demonstrating techniques. Early videos in their initial pilot studies were eerily silent, and it was difficult for them to know what the students’ understanding was as they completed a technique – why were they doing things in a particular way? So they built narration into the requirements of the demonstration. This is one of those things that is obvious in hindsight, but knowing it up front in our own implementation was invaluable.

We opted for a system where students would video each other rather than be videoed by a demonstrator, with the narration being, in effect, one peer telling the other how they were doing the technique. To facilitate a review of this at the end, the demonstrators in the project came up with the idea of a peer observation sheet (they designed them too – bonus points!). The whole set-up was to encourage dialogue – genuine interactions discussing the experimental technique, allowing for feedback based on the guidelines presented in the peer observation sheets. These acted as a framework on which the lab was run. Lord knows chemists like instructions to follow.

Feedback then is given in situ, and indeed if the student demonstrating feels after discussion that they would like to video the technique again, they can. This notion of quality, or exemplary work, is underscored by the exemplars provided to students in advance: pre-laboratory videos dedicated to correct display of technique. The whole framework is based around Sadler’s design for feedback, discussed… in the paper!

We’ve documented our ongoing work on the project blog, and the paper summarising the design and the evaluation is now available in CERP. It is part of the special issue on transferable skills in the curriculum which will be published in the autumn, primarily because we felt the project developed digital literacy skills in addition to the laboratory work; students were required to submit a link to the video they hosted online, rather than the video itself. This gives them control over their digital footprint.

Resources

Resources – digital badges, peer observation sheets, links to exemplar videos – are all freely available on the project website.  I really think there is great scope for badges, and look forward to where this project will go next!

Three badges

Using the Columbo approach on Discussion Boards

As part of our ongoing development of an electronic laboratory manual at Edinburgh, I decided this year to incorporate discussion boards to support students doing physical chemistry labs. It’s always a shock, and a bit upsetting, to hear students say that they spent very long periods of time on lab reports. The idea behind the discussion board was to support them as they were doing these reports, so that they could use the time they were working on them in a more focussed way.

The core aim is to avoid the horror stories of students spending 18 hours on a report, because if they are spending that time on it, much of it must be figuring out what the hell it is they are meant to be doing. Ultimately, a lab report is a presentation of some data, usually graphically, and some discussion of the calculations based on that data. That shouldn’t take that long.

Setting Up

The system set-up was easy. I had asked around and heard some good suggestions for external sites that did this well (I can’t remember the name now, but one, suggested by colleagues in physics, allowed questions to be up-voted). But I didn’t anticipate a volume of questions so large that I could answer only the most pressing, and I didn’t want “another login”, so I just opted for Blackboard’s native discussion board. Each experiment got its own forum, along with a forum for general organisational issues.

Use

A postgrad demonstrator advised me to allow the posts to be made anonymously, and that seemed sensible. Nothing was being graded, and I didn’t want any reticence about asking questions. Even anonymously, some students apologised for asking what they deemed “silly” questions, but as in classroom scenarios, these were often the most insightful. Students were told to use the forum for questions, and initially, any questions by email were politely redirected to the board. In cases close to submission deadlines, I copied the essential part of the question, and pasted it to the board with a response. But once reports began to be due, the boards became actively used. I made sure in the first weekend to check in too, as this was likely going to be the time that students would be working on their reports.

The boards were extensively used. About 60 of our third years do phys chem labs at a time, and they viewed the boards over 5500 times in a 6-week period. Half of these views were on a new kinetics experiment, which tells me as organiser that I need to review that. The second years have just begun labs, and already, in a two-week period, 140 of them viewed the boards 2500 times. The number of posts is of course nowhere near this, suggesting that most viewers are “lurkers”, and probably most queries are common. Since students can post anonymously, I have no data on what proportion of students were viewing the boards. Perhaps it is one person going in lots, but given the widespread viewership across all experiments, my guess is it isn’t. The boards were also accessible to demonstrators (who correct all the reports), but I’ve no idea if they looked at them.
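As a back-of-envelope check on those numbers (my own arithmetic, not anything extracted from the Blackboard logs): if viewing were spread evenly across each cohort, the rates per student per week would be roughly

    cohorts = {
        "3rd year phys chem": {"students": 60, "views": 5500, "weeks": 6},
        "2nd year": {"students": 140, "views": 2500, "weeks": 2},
    }

    for name, c in cohorts.items():
        rate = c["views"] / (c["students"] * c["weeks"])
        print(f"{name}: ~{rate:.0f} views per student per week")

which comes out at about 15 and 9 views per student per week respectively – a rate that is hard to square with a single enthusiastic lurker.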

Reception

The reception from students has been glowing, so much so that it is the surprise “win” of the semester. (Hey, look over here at all these videos I made… No? Okay then!) Students have reported at school council, staff student liaison committees, anecdotally to me and other staff that they really like and appreciate the boards. Which of course prompts introspection.

Why do they like them? One could say that of course students will like them: I’m telling them the answer. And indeed, in many cases, I am. The boards were set up to provide clear guidance on what is needed and expected in lab reports. So if I am asked questions, of course I provide clear guidance. That mightn’t always be the answer, but it will certainly be a very clear direction to students on what they should do. But in working through questions and answers, I stumbled across an additional aspect.

One more thing

Me, when asked an electrochemistry question

Everyone’s favourite detective was famous for saying: “oh: just one more thing“. I’ve found in the lab that students are very keen and eager to know what purpose their experiment has in the bigger context, where it might be used in research, something of interest in it beyond the satisfaction of proving, once again, some fundamental physical constant. And in honesty, it is a failing on our part and in the “traditional” approach that we don’t use this opportunity to inspire. So sometimes in responding to questions, I would add in additional components to think about – one more thing – something to further challenge student thought, or to demonstrate where the associated theory or technique in some experiment we were doing is used in research elsewhere. My high point was when I came across an experiment that used exactly our technique and experiment, published in RSC Advances this year. This then sparked the idea of how we can develop these labs more, the subject of another post.

Again I have no idea if students liked this or followed up these leads. But it did ease my guilt a little that I might not be just offering a silver spoon. It’s a hard balance to strike, but I am certainly going to continue with discussion boards for labs while I work it out.

A tour around Johnstone’s Triangle

In a small laboratory off the M25 is a man named Bob. And Bob is a genius at designing and completing reactions on a very small scale. Bob is greatly helped by Dr Kay Stephenson, Mary Owen and Emma Warwick.

I was invited to go down to CLEAPSS to see Bob in action, and to try out for myself some of the microscale chemistry he has been developing. I was interested to see it because of a general interest in laboratory experiments and how we can expand our repertoire. But I found out a lot more than just smaller versions of laboratory experiments.

Safety and cost considerations first piqued Bob’s interest in microscale. The traditional laboratory Hofmann voltameter costs about £250, but the microscale version, including ingenious three-way taps to syringe out the separated gases, costs about £50. Thoughts about how to do a reduction of copper oxide safely led him to a procedure that avoided the traditional problems with explosions. There’s also a very neat version using iron oxide, incorporating the use of a magnet to show that iron forms.

Electrochemical production of chlorine leading to subsequent production of iodine and bromine. Copper crystals form on the electrode.

Bob promised to show me 93 demonstrations in a morning (“scaled back from 94!”) and I worried on my way there that I would have to put on my polite smile after a while. But actually time flew, and as we worked through the (less than 93) experiments, I noticed something very obvious. This isn’t just about safety and cost. It has deep grounding in the scholarship of teaching and learning too.

Cognitive Load

What I remember from the session is not the apparatus, but the chemistry. Practical chemistry is difficult because we have to worry about setting up apparatus, and this can act as a distraction from the chemistry involved. However, the minimal – and often absent – apparatus meant that we were just doing and observing chemistry. This particularly struck me when we were looking at conductivity measurements, using a simple meter made with carbon fibre rods (from a kite shop). This, along with several other experiments, used an ingenious idea of instruction sheets within polypropylene pockets (Bob has thought a lot about contact angles). The reaction beaker becomes a drop of water, and it is possible to explore some lovely chemistry this way: pH indicator colours, conductivity, precipitation reactions, producing paramagnetic compounds. It’s not all introductory chemistry; we discussed a possible experiment for my third year physical chemists, and there is lots to do for a general chemistry first year lab, including a fabulously simple colourimeter.

Designing a universal indicator.

Johnstone’s Triangle

One of the reasons chemistry is difficult to learn is because we have multiple ways of representing it. We can describe things as we view them, at the macroscopic scale – a white precipitate forms when we precipitate out chloride ions with silver ions. We can describe things at the atomic scale, describing the ionic movement leading to the above precipitation. And we can use symbolism, for example representing the ions in a diagram, or talking about the solubility product equation. When students learn chemistry, moving between these “domains” is an acknowledged difficulty. These three domains were described by Alex Johnstone, and we now describe this as Johnstone’s triangle.

Johnstone’s triangle (from U. Iowa Chemistry)

One of my observations from the many experiments I carried out with Bob was that we can begin to see these reactions happening. The precipitation reactions took place over about 30 seconds as the ions from a salt at each side migrated through the droplet. Conductivity was introduced into the assumed un-ionised water droplet by shoving in a grain or two of salt. We are beginning to jump across representations visually. Therefore what has me excited about these techniques is not just laboratory work, but activities to stimulate student chatter about what they are observing and why. The beauty of the plastic sheets is that they can just be wiped off quickly with a paper towel before continuing on.

Reaction of ammonia gas (centre) with several solutions including HCl with universal indicator (top right) and copper chloride (bottom right)

Bob knew I was a schoolboy chemist at heart. “Put down that book on phenomenology” I’m sure I heard him say, before he let me pop a flame with hydrogen and reignite it with oxygen produced from his modified electrolysis apparatus (I mean who doesn’t want to do this?!). I left the room fist-bumping the air after a finale of firing my own rocket, coupled with a lesson in non-Newtonian liquids. And lots of ideas to try. And a mug.

I want a CLEAPSS set to be developed in time for Christmas. In the meantime, you can find lots of useful materials at: http://science.cleapss.org.uk/.

Using digital technology to assess experimental science

The following was a pre-conference piece submitted to a Royal Society conference on assessment in practical science.

Summary

A modern laboratory education curriculum should embrace digital technologies, with assessment protocols that enable students to showcase their skills and competences. With regard to assessment, such a curriculum should:

  • incorporate the digital domain for all aspects related to experimental work; preparations, activities, reflections;
  • provide a robust and valid assessment framework but with flexibility for individuality;
  • emphasise to students the role of documenting evidence in demonstrating skills and competences by means of micro-accreditation, such as digital badges.

This paper summarises how some digital technologies can address the above points.

How can research into the use of digital technology in the assessment of experimental science improve the validity of assessment in the short, medium and long term?

Re-shifting the emphasis of assessment by means of e-assessment

Our use of digital technologies in everyday life has increased substantially in the last two decades. In contrast, laboratory education has remained stubbornly paper-based, with laboratory notebooks at the core of assessment protocols. This emphasis on a post-hoc report of work done, rather than a consideration of the work itself, means that the value of laboratory work has been distorted in favour of the process of preparing laboratory reports. Experimental work, and the demonstration of experimental skills and competences, is of secondary importance.

There are good reasons why emphasis has historically been on the laboratory report instead of laboratory work. Directly assessing experimental work, and indeed any input students have to the planning and completion of experimental work, is subjective. Issues also arise if laboratory work is completed in groups, for either pedagogic or resource reasons. Assigning individual marks is fraught with difficulty.

Digital technologies can provide a basis to address many of the concerns regarding validity that the above issues raise, and provide an opportunity to reposition what is considered to be important in terms of the goals and purpose of experimental science.

The completion of experimental work typically involves:

  • Preparation: planning and preparing for work and making decisions on experimental approaches to be taken;
  • Action: learning how to carry out work competently, demonstrating competence in experimental approaches, and accurately recording data and/or observations;
  • Reflection: drawing conclusions from data, reporting of findings, and evaluation of approaches taken.

Incorporating the digital domain for all aspects of experimental work

Wikis and electronic laboratory notebooks are online document editing spaces that enable individual contributions to be documented and reviewed. Such platforms have been shown to allow the documentation of students’ thoughts and contributions to work, and as such they provide an excellent basis for recording the entire process (preparation, action, reflection) the student engages with while completing experimental work. Preparation can include a description of what equipment will be used and why, or thoughts on the purpose of the experiment. Action can be documented by recording experimental work completed, with the inclusion of data or observations in a variety of multimedia formats (text/photos/video/audio). Reflection can allow for a richer form of the typical lab report. In practice this means asking students to consider and review their experimental approach, so that the emphasis shifts away from the “right answer” (an often cited criticism of students coming through a school laboratory curriculum) and more towards a consideration of the approach taken.
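As a concrete sketch of what this could look like in data terms – the field names and structure below are my own illustrative assumptions, not a description of any particular wiki or electronic lab notebook product – each entry would hold attributable, time-stamped contributions across the three phases:

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class Contribution:
        author: str
        phase: str           # "preparation" | "action" | "reflection"
        content: str         # text, or a link to photo/video/audio evidence
        timestamp: datetime = field(default_factory=datetime.now)

    @dataclass
    class NotebookEntry:
        experiment: str
        contributions: list = field(default_factory=list)

        def log(self, author, phase, content):
            self.contributions.append(Contribution(author, phase, content))

        def by_author(self, author):
            # Traceability: every contribution is attributable and date-stamped.
            return [c for c in self.contributions if c.author == author]

    entry = NotebookEntry("Determination of a rate constant")
    entry.log("student_a", "preparation", "Chose a conductivity method because ...")
    entry.log("student_a", "action", "https://example.org/video-of-technique")
    entry.log("student_a", "reflection", "The main source of error was ...")

The per-author view is exactly the traceability that the next section leans on as a basis for validity.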

Using traceability as a basis for validity of assessment

Validity is a core concern for a national high-stakes examination. Research to date on wikis has pointed to the advantages offered, including that student contributions are date-stamped and each individual contribution is logged. Overall contributions to work can be tracked. Rubrics have been used effectively to assess student laboratory skills, although compiling rubrics requires a considerable investment in documenting the desired goals and expectations of any particular curriculum experiment or inquiry so that they can be easily assessed. A more flexible approach to documenting science work using wikis and electronic lab notebooks allows scope for individuality within an overall framework of requirements. However, this is an area that needs considerable and ongoing research.

There is a body of research discussing the use of virtual laboratories for mimicking student experimentation, as they provide for more controlled and hence robust assessment protocols. These should be resisted, as they remove students’ exposure to the situational and psychomotor demands that being in the laboratory brings. While virtual laboratories may play some role in summative assessment – for example in decision making – they will likely act as a distraction from the changes required to engage with and document real hands-on work, as they will again shift the focus of experimental science away from actual laboratory work.

Emphasis on experimental science and documenting competences

An advantage of refocusing on the documenting of processes is that there is an opportunity for students to showcase their own experimental skills. Digital badges have emerged as a way to accredit these, in what is known as “micro-accreditation”. Digital badges mimic the idea of Guides and Scouts badges by acknowledging achievements and competences in a particular domain. Examples could include badging students’ experimental skills (for example, badges for pipetting or titrating) and higher-level badges, where students would need to draw on a range of competences already awarded and apply them to a particular scenario (for example, an overall analysis where students would need to design the approach and draw on their technical competency in pipetting and titrations). This enables students to document their own progress in an ongoing way, and allows them to reflect on any activities needed to complete a full set of badges on offer. This is an exciting area as it offers significant expansion across the curriculum, and mobile learning platforms will open up new and interesting ways to develop these approaches.
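A minimal sketch of that chaining – the badge names and the prerequisite rule are mine for illustration, not a description of Purdue’s Passport or any other live system – where a higher-level badge can only be awarded once its component technique badges are held:

    # student -> set of badges already held
    awarded = {}

    prerequisites = {
        "pipetting": set(),
        "titrating": set(),
        "overall analysis": {"pipetting", "titrating"},  # higher-level badge
    }

    def award(student, badge):
        held = awarded.setdefault(student, set())
        if prerequisites[badge] <= held:   # are all prerequisite badges held?
            held.add(badge)
            return True
        return False

    print(award("student_a", "pipetting"))         # True
    print(award("student_a", "overall analysis"))  # False: "titrating" missing
    print(award("student_a", "titrating"))         # True
    print(award("student_a", "overall analysis"))  # True: progression complete

The set difference prerequisites[badge] - held also gives a student the “what remains” view needed to reflect on completing a full set of badges.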

Conclusions

Changing from paper-based to electronic media is not without difficulties. In terms of short-, medium-, and long-term objectives, an initial focus should be on promoting the possibilities of documenting scientific work in school through the use of multimedia. This will develop a culture and expertise around the use of technical skills, working towards the medium-term goal of documenting work on an online platform instead of on paper – emphasising the value of documenting evidence of processes. This can be complemented with the development of a suite of digital badges associated with expected experimental techniques and protocols. In the long term, this allows the consideration of assessment of laboratory work via wikis and electronic lab notebooks, using appropriate rubrics, which allow students to genuinely and accurately showcase their competence in experimental science in a much more meaningful and engaging way.

Video: e-portfolios for documenting learning in the 21st century

Here is a talk I gave to school teachers at an NCCA event in Dublin Castle earlier this year. It discusses:

  1. Why and how should we enable students to create digital artefacts to document their learning?
  2. Practical approaches to doing this – discussing digital badges and wikis in particular.

[Embedded video: Vimeo]

#ViCEPHEC16 – curly arrows and labs

The annual Variety in Chemistry Education/Physics Higher Education conference was on this week in Southampton. Some notes and thoughts are below.

Curly arrows

Physicists learned a lot about curly arrows at this conference. Nick Greeves‘ opening keynote spoke about the development of ChemTube3D – a stunning achievement: over 1000 HTML pages, mostly developed by undergraduate students. News for those who know the site is that 3D curly arrow mechanisms are now part of the reaction mechanism visualisations, with really beautiful visualisations of changing orbitals as a reaction proceeds for 30+ reactions, lovely visualisations of MOFs, direct links to/from various textbooks, and an app at the prototype stage. Nick explained that this has all been developed with small amounts of money from various agencies, including the HEA Physical Sciences Centre.

Mike Casey from UCD spoke about a resource at a much earlier stage of development; an interactive mechanism tutor. Students can choose a reaction type and then answer the question by drawing the mechanism – based on their answer they receive feedback. Version 2 is on the way with improved feedback, but I wondered if this feedback might include a link to the appropriate place in Chemtube3D, so that students could watch the associated visualisation as part of the feedback.

In the same session Robert Campbell spoke about his research on how A-level students answer organic chemistry questions. My understanding is that students tend to use rules of mechanisms (e.g. a primary alkyl halide means it’s always SN2) without understanding the reason why, which promotes rote learning. In a nice project situated in the context of cognitive load theory, Rob used Livescribe technology to investigate students’ reasoning. I am looking forward to seeing this research in print.

Rob’s future work alluded to the video worked answers described by Stephen Barnes, also for A-level students. These demonstrated a simple but clever approach: using questions resembling A-level standard, asking students to complete them, providing video worked examples so students could self-assess, and then getting them to reflect on how they can improve. David Read mentioned that this model aligned with the work of Sadler, worth a read.

Laboratory work

Selfishly, I was really happy to see lots of talks about labs on the programme. Ian Bearden gave the physics keynote, and he spoke about opening the laboratory course – meaning the removal of prescriptive instructions, allowing students to develop their own procedures. Moving away from pure recipe is of course music to this audience’s ears, and the talk was very well received. But you can’t please everyone – I would have loved to hear much more about what was done and the data involved, rather than the opening half of the talk about the rationale for doing so. A short discussion prompted this tweet from Felix Janeway, something we can agree on! But I will definitely be exploring this work more. Ian also mentioned that this approach is also part of physics modules taught to trainee teachers, which sounded a very good idea.

Jennifer Evans spoke about the prevalence of pre-labs in UK institutions, following on from the Carnduff and Reid study in 2003. Surprisingly, many institutions don’t have any form of pre-lab work. It will be interesting to get a sense of what pre-lab work involves – is it theory or practice? Theory and practice were mentioned in a study from Oxford presented by Ruiqi Yu, an undergraduate student; this showed mixed messages on the purpose of practical work, surely something the academy needs to agree on once and for all. There was also quite a nice poster from Oxford involving a simulation designed to teach experimental design, accessible at this link. This was also built by an undergraduate student. Cate Cropper from Liverpool gave a really useful talk on tablets in labs – exploring the nitty gritty of how they might work. Finally on labs, Jenny Slaughter gave an overview of the Bristol ChemLabS, which is neatly summarised in this EiC article, although the link to the HEA document is broken.

Other bites

  • Gwen Lawrie (via Skype) and Glenn Hurst spoke about professional development; Gwen mentioned this site she has developed with Madeline Schultz and others to inform lecturers about PCK. Glenn spoke about a lovely project on training PhD students for laboratory teaching – details here.  This reminds me of Barry Ryan‘s work at DIT.
  • Kristy Turner gave an overview of the School Teacher Fellow model at Manchester, allowing her to work both at school and university, with obvious benefits for both. Kristy looked forward to an army of Kristys, which would indeed be formidable, albeit quite scary. Even without that, the conference undoubtedly benefits from the presence of school teachers, as Rob’s talk, mentioned above, demonstrates.
  • Rachel Koramoah gave a really great workshop on qualitative data analysis. Proving the interest in chemistry education research, this workshop filled up quickly. The post-it note method was demonstrated, which was interesting and which I will certainly explore more, though I hope to tease out a bit more detail on the data reduction step. That is the benefit of this model – the participants reduce the data for you – but I worry that it might in turn lead to loss of valuable data.
  • Matthew Mears gave a great byte on the value of explicit signposting to textbooks using the R-D-L approach: Read (assign a reading); Do (Assign questions to try); Learn (assign questions to confirm understanding). Matt said setting it up takes about 30 minutes and he has seen marked improvements in student performance in comparison to other sections of the course.
  • David Nutt won the best poster prize. His poster showed the results of eye-tracking experiments to demonstrate the value or not of an in-screen presenter. Very interesting results which I look forward to seeing in print.

The conference organisation was brilliant and thanks to Paul Duckmanton and Charles (Minion) Harrison for leading the organisation. Lots of happy but tired punters left on Friday afternoon.

I couldn’t attend everything, and other perspectives on the meeting, with links etc., can be found at the links below. From Twitter, Barry Ryan’s presentation on NearPod seemed popular, along with the continuing amazingness of my colleagues in the Edinburgh Physics Education Research Group. One of their talks, by Anna Wood, is available online.
