Argument Analysis Wall

Last updated September 16, 2021 by Brian Plüss

The idea

We aimed to make debates available on the Argument Web, where any compatible online tool can access them. Specifically, we wanted to analyse broadcast debate and support online interaction with those arguments. Live. To do it, we needed lots of analysts working together, using a large touch screen running bespoke software to collaboratively analyse the discourse. Stenographic transcription, argument segmentation and enthymeme reconstruction are all carried out by other team members. A short video of the result is available, along with an unedited single-camera view of the full 45 minutes and a more interesting, multi-camera video of the complete analysis.

The AnalysisWall

As far as we know, no-one has tried to do close argument analysis in real time. Compendium IBIS map facilitators come closest, but they work at a much higher level of abstraction. Our experience suggests that the discourse of a 45-minute broadcast debate takes a single analyst between one and two weeks to analyse. Nor is analysis easily parallelisable: there are too many interconnections between its subparts. The natural conclusion is that you need lots of pairs of hands and a large shared workspace.

So, on a shoestring, we built an FTIR touchscreen, 3.2m long, 2.4m high, rear projection at a resolution of 5760×2160, and developed an analysis application which outputs to AIF2. As our test case, we analyse episodes of the Moral Maze, broadcast on BBC Radio 4. The audio is transcribed by a stenographic service in London, and arrives with us as a text feed. It is then segmented into argument components by two ‘chunkers’ who work in stints of five minutes or so before swapping over (rather like simultaneous translators, who typically spread the intense cognitive load in the same way). Output from the chunkers is handed on to the AnalysisWall application. A further app injects reconstructed enthymematic components (if, for example, analysts want to make a presumption explicit). Then between five and seven analysts work to tease apart the argumentative structure on the wall.
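
To give a flavour of the target representation, the sketch below shows how a fragment of such a map can be pictured: information nodes (I-nodes) for the spoken and reconstructed propositions, connected through scheme nodes such as rule applications (RA-nodes). The field names and example propositions are illustrative assumptions for exposition, not the exact AIF2 serialisation the wall produces.

    # Illustrative sketch of an AIF-style argument map fragment.
    # Field names and propositions are assumptions for exposition only,
    # not the AnalysisWall's actual AIF2 output.
    argument_map = {
        "nodes": [
            {"id": 1, "type": "I",  "text": "Hypothetical premise spoken by a panellist."},
            {"id": 2, "type": "I",  "text": "Hypothetical presumption made explicit by an analyst."},
            {"id": 3, "type": "RA", "text": "Default Inference"},
            {"id": 4, "type": "I",  "text": "Hypothetical conclusion drawn from the premises."},
        ],
        "edges": [
            {"from": 1, "to": 3},   # premise supports the inference
            {"from": 2, "to": 3},   # injected enthymematic premise supports it too
            {"from": 3, "to": 4},   # the inference yields the conclusion
        ],
    }

In this sketch, premises point into the RA (rule application) node, which in turn points to the conclusion; an injected enthymematic premise sits alongside the spoken material as an ordinary I-node.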

After a demo run the week before, our first full live deployment was conducted at 8pm on 18 July 2012, analysing the sixth episode in the summer 2012 season of the Moral Maze. The full video of our analysis will be available shortly; a five-minute video of the highlights is available now. The result of the analysis is online as a part of the Argument Web.

The hardware

The screen is formed from a 10mm-thick acrylic sheet, over which is hung a layer of translucent paper backed by three coats of a xylene-silicone mix which form a compliant surface and allow gesture tracking. IR LEDs are sunk and glued into 8mm deep holes drilled at 20mm centres into the acrylic along the top and bottom edges. The projected image is produced with six bargain-basement 1080p projectors, painstakingly aligned, colour-balanced and tweaked (no image integration or edge blending). The projectors are driven by a Matrox 6-head card in a custom PC responsible for running the AnalysisWall software and nothing else. Detection uses the CCV package with IR input from six PS2 eye webcams, each covering roughly one projected image area, yielding a detection resolution of around 1920×960 at (in practice) around 30fps. The rate of data delivery from the cameras demands a USB3 bus, and unexplained interference between CCV, the OS and Python meant it made sense to run CCV on a separate PC and send the TUIO events over the network.
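
For illustration only, a minimal listener for those network TUIO events might look like the sketch below. It assumes the python-osc package and the conventional TUIO transport (OSC messages over UDP, usually on port 3333); the AnalysisWall's own software is bespoke and is not reproduced here.

    # Minimal sketch of a networked TUIO listener (assumes the python-osc
    # package). Illustrative only -- the AnalysisWall itself is bespoke.
    from pythonosc import dispatcher, osc_server

    def handle_cursor(address, *args):
        # /tuio/2Dcur messages carry 'alive', 'set' and 'fseq' commands;
        # 'set' includes a session id plus normalised x, y coordinates.
        if args and args[0] == "set":
            session_id, x, y = args[1], args[2], args[3]
            print(f"touch {session_id}: x={x:.3f} y={y:.3f}")

    disp = dispatcher.Dispatcher()
    disp.map("/tuio/2Dcur", handle_cursor)

    # CCV sends TUIO over UDP; 3333 is the conventional TUIO port.
    server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 3333), disp)
    server.serve_forever()

This mirrors the split described above: CCV and the cameras run on their own PC, and the machine driving the projectors simply receives the touch events over the network.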

Photos: wiring; setting the IR LEDs; the frame; the wall in situ.

The team

The AnalysisWall has been designed and constructed by the Argumentation Research Group (School of Computing, University of Dundee) led by Prof. Chris Reed.

It forms a part of the EPSRC-funded Dialectical Argumentation Machines project.

Hardware construction, software development, testing and deployment of the AnalysisWall relied upon the hard work of Marcelo Acuna, Mike Beattie, Floris Bex, Adam Brown, Katarzyna Budzynska, John Lawrence, Emily McDonald, Kari McMahon, Phil Quinlan, Mark Snaith and Mark Zarb.

Photos, L-R: Adam Brown, Kari McMahon, Emily McDonald. L-R: Mike Beattie, Mark Snaith, John Lawrence, Katarzyna Budzynska. L-R: Adam Brown, Floris Bex, Mark Snaith, Chris Reed, Mike Beattie.

ARG:dundee would also like to acknowledge the support of Derek Brankin and Mahamadou Niakate, who provided technical support in the School of Computing, and Alan Clark and his team at the University of Dundee who delivered the carpentry work.

Finally, we are enormously grateful for the continued enthusiasm and support for this work offered by Christine Morgan, Head of Religion Radio and executive producer at the BBC for the Moral Maze.