SenseCam

Established: February 25, 2004

SenseCam is a wearable camera that takes photos automatically. It was originally conceived as a personal ‘Black Box’ accident recorder, but it soon became evident that looking through previously recorded images tends to elicit quite vivid remembering of the original event. This exciting effect has formed the basis of a great deal of research around the world using SenseCam, the Vicon Revue and the OMG Autographer (both produced for several years under license), and a number of other similar devices.

Introduction to SenseCam

SenseCam is a wearable digital camera designed to take photographs passively, without user intervention, while it is being worn. Unlike a regular digital camera or a cameraphone, SenseCam has no viewfinder or display with which to frame photos. Instead, it is fitted with a wide-angle (fish-eye) lens that maximizes its field of view. This ensures that nearly everything in the wearer’s view is captured, which is important because a conventional wearable camera, with no viewfinder and a narrower field of view, would likely produce many badly framed, uninteresting images.

SenseCam also contains a number of different electronic sensors. These include light-intensity and light-color sensors, a passive infrared (body heat) detector, a temperature sensor, and a multiple-axis accelerometer. These sensors are monitored by the camera’s microprocessor, and certain changes in sensor readings can be used to automatically trigger a photograph to be taken.

For example, a significant change in light level, or the detection of body heat in front of the camera, can cause a picture to be taken. Alternatively, the user may elect to set SenseCam to operate on a timer, for example taking a picture every 30 seconds. We have also experimented with incorporating audio-level detection, audio recording and GPS location sensing into SenseCam, although these do not feature in the current hardware.
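The triggering behaviour described above can be sketched as follows. This is an illustrative reconstruction rather than the actual SenseCam firmware: the function name, thresholds and sensor interface are all assumptions.

```python
# Hypothetical sketch of SenseCam-style capture triggering (not the actual
# firmware). A photo is taken on a timer, or early when a sensor reading
# changes significantly; the thresholds below are illustrative only.

def should_capture(now, last_capture, light_level, last_light,
                   pir_detected, interval=30.0, light_delta=0.25):
    """Return True if a photo should be taken at time `now` (seconds)."""
    # Timer trigger: capture every `interval` seconds regardless of sensors.
    if now - last_capture >= interval:
        return True
    # Light trigger: a large relative change in light level (e.g. walking
    # from indoors to outdoors) suggests a change of scene.
    if last_light > 0 and abs(light_level - last_light) / last_light > light_delta:
        return True
    # PIR trigger: body heat detected in front of the camera.
    if pir_detected:
        return True
    return False
```

In this sketch the timer guarantees a baseline capture rate, while the sensor rules simply bring the next photo forward when something appears to be happening.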

In our current design (v2.3), users typically wear the camera on a cord around their neck, although it would also be possible to clip it to pockets or belts, or to attach it directly to clothing. There are several advantages of using a neck-cord to wear the camera. First, it is reasonably stable when being worn, as it tends not to move around from left-to-right when the wearer is walking or sitting. Second, it is relatively comfortable to wear and easy to put on and take off. Third, when worn around the neck, SenseCam is reasonably close to the wearer’s eyeline and generates images taken from the wearer’s point of view – i.e., they get a ‘first person’ view. Informal observations suggest that this results in images that are more compelling when subsequently replayed.

SenseCam takes pictures at VGA resolution (640×480 pixels) and stores them as compressed .jpg files on internal flash memory. We currently fit 1 GB of flash memory, which can typically store over 30,000 images. Most users seem happy with the relatively low-resolution images, suggesting that the time-lapse, first-person-viewpoint sequences represent a useful media type that sits somewhere between still images and video. It also reflects the fact that the images are used as memory supports rather than as rich media. Along with the images, SenseCam also stores a log file, which records the other sensor data along with timestamps. Additional user data, such as time-stamped GPS traces, may be used in conjunction with the SenseCam data via time-correlation.
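As a simple illustration of the time-correlation idea, nearest-timestamp matching is enough to attach a GPS fix to each image. The function and data layout below are hypothetical, not SenseCam’s actual log format.

```python
# Illustrative sketch (not part of the SenseCam software): correlating
# time-stamped GPS fixes with SenseCam image timestamps.
import bisect

def correlate(image_times, gps_fixes):
    """gps_fixes: list of (timestamp, lat, lon), sorted by timestamp.
    Returns, for each image timestamp, the GPS fix nearest in time."""
    gps_times = [t for t, _, _ in gps_fixes]
    matched = []
    for t in image_times:
        i = bisect.bisect_left(gps_times, t)
        # choose the closer of the two neighbouring fixes
        candidates = [j for j in (i - 1, i) if 0 <= j < len(gps_fixes)]
        best = min(candidates, key=lambda j: abs(gps_times[j] - t))
        matched.append(gps_fixes[best])
    return matched
```

The same nearest-timestamp approach works for any other time-stamped stream a user might want to align with the image sequence.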

Reviewing and Sharing SenseCam Images

The data recorded by the SenseCam can be downloaded onto a desktop or laptop computer, typically at the end of a day or week. Microsoft Research developed a simple viewer application that can be used to transfer the images in this way and then display them. The basis of the viewer, which is designed to be very straightforward to use, is a window in which images are displayed and a simple VCR-type control which allows an image sequence to be played slowly (around 2 images/second), quickly (around 10 images/second), re-wound and paused.

The fast-play option creates a kind of ‘flip-book’ movie effect – the entire event represented by the images is replayed as a time-compressed movie. Such rapid serial visual presentation (RSVP) techniques are well studied in the psychological literature and are particularly suited to SenseCam images. It is possible to delete individual images from the sequence if they are badly framed or of poor quality. An additional option is provided to correct for the ‘fish-eye’ lens effect using an algorithm that applies an inverse model of the distortion.
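The inverse-distortion idea can be sketched with a simple radial model. This is a generic barrel-distortion correction for illustration only, with an assumed distortion coefficient k; it is not the algorithm actually used in the viewer.

```python
import numpy as np

def undistort_fisheye(img, k=0.4):
    """Sketch of barrel-distortion correction by inverse mapping: each
    output pixel is traced back through a simple radial model
    r_src = r_dst * (1 + k * r_dst^2) (normalized radii) and sampled
    with nearest-neighbour lookup. `k` is an assumed coefficient."""
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    norm = np.hypot(cx, cy)            # normalize radius to roughly [0, 1]
    ys, xs = np.indices((h, w), dtype=np.float64)
    dx, dy = (xs - cx) / norm, (ys - cy) / norm
    r2 = dx * dx + dy * dy
    scale = 1.0 + k * r2               # stronger stretch far from the centre
    src_x = np.clip(np.round(cx + dx * scale * norm), 0, w - 1).astype(int)
    src_y = np.clip(np.round(cy + dy * scale * norm), 0, h - 1).astype(int)
    return img[src_y, src_x]
```

Working backwards from output pixels to source pixels (inverse mapping) avoids holes in the corrected image, which is why most undistortion code is structured this way.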

It is also possible to import SenseCam image sequences into more sophisticated applications. MyLifeBits allows the large number of images generated daily to be easily searched and accessed. Dublin City University has developed a sophisticated SenseCam image browser which assists in splitting sequences of images into distinct events by automatically analysing the images and sensor data generated by SenseCam.

Using SenseCam to Alleviate Memory Loss

Early on in the development of SenseCam, we became aware of the work of the Memory Clinic and Memory Aids Clinic at Addenbrooke’s Hospital, Cambridge, UK. This is a centre of excellence in the UK for diagnosing various conditions that affect memory, and for working with patients to try and mitigate their symptoms. While there are established techniques to help people remember to do things (i.e. supplement their prospective memory), there are very few aids that complement autobiographical memory, i.e. support the remembrance of things done or experienced. The Memory Clinic was excited by the potential of SenseCam to help in this regard.

In around 2005 we started a trial with a 63-year-old patient from the clinic with amnesia resulting from a brain infection. The patient, Mrs. B, was given a SenseCam and asked to wear it whenever she anticipated a ‘significant event’ – the sort of event that she would like to remember (i.e. not just something routine or mundane).

After wearing SenseCam for the duration of such an event, Mrs. B would spend around one hour reviewing the images every two days, for a two-week period.

Without any aids to recall, Mrs. B typically completely forgets everything about an event after five days or less. However, during the course of this period of assisted recall using SenseCam, Mrs. B’s memory for the event steadily increased, and after two weeks she could recall around 80 percent of the event in question. What is perhaps more remarkable is that following the two-week period of aided recall, Mrs. B appears to have a lasting ability to recall the event even without reviewing the images.

The results of that initial trial with SenseCam are shown here:

[Figure: graph_1 – results of the initial SenseCam trial]

Following the success of the first trial and the excitement it generated in both the research and clinical rehabilitation communities, Microsoft Research made SenseCam devices available to a large number of researchers and also initiated additional trials related to SenseCam’s use as a memory aid. Using SenseCam seems to be a very positive experience for most of the patients involved. Many have reported enjoying using it and reviewing images of their experiences, explaining that it makes them feel much more confident and relaxed. This is in stark contrast to the use of a written diary, which patients typically report has the opposite effect. Carers have also reported that they find SenseCam very beneficial. Here are some of the things that patients and their carers have said about SenseCam:

  • “I am less anxious, because it helps to settle, or verify, what actually happened…”
  • “It has enormous potential as a memory aid and has been a great success for us personally”
  • “Looking at the images is definitely helpful… they cue memories of things I would normally just forget”
  • “SenseCam is a Godsend… everyone should have one!”
  • I am “more relaxed socially and less anxious”
  • “Sharing experiences again is a sheer pleasure”

Microsoft has provided over $0.5M in funding, including SenseCam devices, software and support, to facilitate collaborative research projects with academic and clinical memory experts around the world. Some of these projects, which broadly aim to address specific research questions and to further our understanding of how SenseCam appears to give such dramatic improvements in memory recall, are listed below:

  • SenseCam in the study and support of memory in Transient Epileptic Amnesia. Professor Adam Zeman, University of Exeter, UK
  • SenseCam-facilitated recollection in patients with dementia. Professor Phil Barnard, Medical Research Council, Cambridge, UK, & Dr Linda Clare, University of Bangor, Wales, UK
  • Why and how are SenseCam movies such a powerful aid to memory? Locating the brain basis of memory improvement. Professor Roberto Cabeza, Duke University, US, and Professor Martin Conway, University of Leeds, UK
  • Enhancing quality of life in Alzheimer’s Disease with automatic SenseCam records of days in one’s life. Professor Ron Baecker, University of Toronto, and Professor Yaakov Stern, Columbia Medical School, US
  • Evaluation of SenseCam as a tool for aiding executive self-monitoring and control of emotion and behaviour after brain injury. Fergus Gracey, The Oliver Zangwill Centre for Neuropsychological Rehabilitation, Ely, UK
  • Evaluation of SenseCam as a retrospective memory compensation aid following acquired brain injury. David Winkelaar, Psychologist, The Halvar Jonson Centre for Brain Injury, Ponoka, Alberta, Canada
  • SenseCam as a Tool to Study Memory Processes in Autobiographical Memory. Professor William F. Brewer, Professor Aaron S. Benjamin, and Jason R. Finley, Department of Psychology, University of Illinois, Urbana-Champaign, US

Other Applications for SenseCam

In addition to the use of SenseCam as an aid for people with memory loss, the device has a number of other potential applications. In 2005, Microsoft provided some of the first SenseCams to a number of academic collaborators interested in the general area of ‘digital memories’, i.e. life-recording or life-logging. These projects applied SenseCam in a variety of ways. For example CLARITY, the Centre for Sensor Web Technologies at Dublin City University, Ireland, is working on systems that will automatically generate ‘landmark images’ through analysis of the large number of images and other logged data recorded by SenseCam. In this way a personalized memory experience of a visit to a museum, national monument, etc. can be automatically generated, based on data collected by SenseCams worn during the visit. The CLARITY Centre has also carried out a wide range of further SenseCam-related research.

We are also working with Dr Charlie Foster and his colleagues from the Health Promotion Research Group at Oxford University, UK. This work, funded in part by the British Heart Foundation, looks at the relationship between the environment and physical activity – for example, how effective the provision of cycle lanes is in encouraging people to leave their cars at home. SenseCam can be useful as a means to measure various aspects of the environment and the amount of exercise people take. The group is also using SenseCam as a tool to record food choices and eating habits.

We worked with the Universities of Nottingham and Bath, the BBC, BT and two small companies, Blast Theory and ScienceScope as part of a project called Participate. The purpose of Participate is to design, develop and test the utility of novel, pervasive, lightweight and wearable technologies that support mass participation in science, education, art and community life. SenseCam has been used by a number of school children as part of this project. In a separate piece of work, SenseCam has been used in the classroom to enable teachers to create a log of their day, supporting various aspects of reflective practice and thereby enabling users of the device to analyse their day afterwards. SenseCam has also been used in an office environment to support studies of how office workers spend their day, and in particular how they manage to work simultaneously on different tasks.

Collaborations with a number of other researchers around the world to further explore yet more potential usages for SenseCam include:

  • As a tool to assess accessibility issues encountered by wheelchair users.
  • To support the coordination of disaster response by recording visual information encountered by responders, who are preoccupied with providing hands-on help.
  • As an automatic diary that doesn’t require expensive, intrusive recording equipment or restrict a user’s activities.
  • A non-intrusive market research tool.
  • To monitor physiological data to help patients understand the sequence of events that precedes a period of intense anxiety or anger.
  • To monitor lighting conditions in schools and to learn how they affect students.
  • Capturing personal experiences for sharing with others.

More Information and Downloads

There is an active SenseCam research community, which meets annually at the SenseCam Symposium. For more information please visit the SenseCam wiki. There is also a Wikipedia page on SenseCam. SenseCam is on display at the London Science Museum in their ‘Who am I?’ gallery. Videos describing SenseCam are also available.

The SenseCam is available to buy as the Vicon Revue. In addition to the software for viewing images that Vicon supply, the original Microsoft Research Image Viewer is available, as is a more advanced viewer from Dublin City University.

Images of the SenseCam device and sample images taken with a SenseCam are available to download. We also have a selection of quotes from some of our collaborators.

For more information about SenseCam research please contact us at sensecam@microsoft.com

Q&A

How many images does the SenseCam take?

SenseCam typically takes a picture every 30 seconds, although this is user-configurable. The maximum rate of capture is one image every 5 seconds. With 1 GB of storage fitted inside the device, it is capable of storing over 30,000 images, which in practical terms is a week or two’s worth of pictures. When the internal storage is full, the images must be downloaded to a PC.
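A quick back-of-envelope check of these figures, assuming an average VGA JPEG of roughly 30 KB (an estimate, not a published specification):

```python
# Sanity check of the capacity figures quoted above (illustrative only):
# 1 GB of flash at an assumed average of ~30 KB per VGA JPEG.
storage_bytes = 1 * 1024**3
avg_image_bytes = 30 * 1024
capacity = storage_bytes // avg_image_bytes    # ≈ 34,952 images

images_per_day = 24 * 60 * 60 // 30            # one image every 30 s
days_of_capture = capacity / images_per_day    # ≈ 12 days
```

At the default 30-second interval this works out to roughly 2,880 images per day and around twelve days of capture, consistent with the “week or two” quoted above.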

How long does the battery last?

The rechargeable battery in the SenseCam will run continuously for around 24 hours when it’s capturing an image every 30 seconds or so. It takes around 3 hours to recharge using a USB connection to a PC or a mains adapter.

How do you use the sensor data?

Data from the various sensors in the SenseCam is collected continuously and recorded on the internal storage card. SenseCam also uses information from the sensors to trigger additional image capture, beyond the ‘image every 30 seconds’ which is captured in any case. For example, if the SenseCam has been stationary for some time, perhaps because it has been put down somewhere, the PIR sensor will detect people coming into view and this will trigger additional photos to be taken. In some applications, such as our work with patients who have memory loss conditions, simple timed triggering may well be sufficient.

The sensor data may also be used after the event to facilitate various types of automatic analysis of an image sequence. A good example of this is the automatic landmark generation research.

Who invented SenseCam? Who worked on the project?

Whilst working at Microsoft Research, Lyndsay Williams initiated the first prototype of SenseCam in 2003, motivated by the idea of a ‘black box’ accident recorder for people. Since then a large number of people at Microsoft Research have developed the project very significantly. Steve Hodges designed the SenseCam device and led an initiative to disseminate devices around the world for research into a number of different aspects of memory, activity and nutrition monitoring, market research, and other topics. The device has also been commercialised by Vicon as the Revue and by the OMG group as the Autographer. Others involved in various aspects of hardware and software development, evaluation and experimentation include: Emma Berry, Georgina Browne, Alex Butler, Rowanne Fleck, Andrew Fogg, Richard Harper, Steve Hodges, Shahram Izadi, Matt Lee, Mike Massimi, Narinder Kapur, Dave Randall, Alban Rrustemi, James Scott, Abigail Sellen, Gavin Smyth, James Srinivasan, Trevor Taylor and Ken Woodberry. SenseCam and all associated intellectual property is owned by Microsoft Research.

Publications reporting work with SenseCam

The following publications report work with SenseCam or its direct descendants, the Vicon Revue and OMG Autographer devices which were manufactured under licence. These include eight PhD theses describing research which leveraged SenseCam.

Note that work presented at the first three SenseCam Conferences, SenseCam 2012, SenseCam 2010 and SenseCam 2009, is largely not included because these were not archival events. Papers included at SenseCam 2013 were, however, published by the ACM and are listed below.

  1. Alessandro Perina, Sadegh Mohammadi, Nebojsa Jojic and Vittorio Murino. Summarization and classification of wearable camera streams by learning the distributions over deep features of out-of-sample image sequences. The IEEE International Conference on Computer Vision (ICCV 2017),  pp. 4326-4334. October 2017.
  2. Schrempft, Stephanie, Cornelia H.M. van Jaarsveld, and Abigail Fisher. Exploring the Potential of a Wearable Camera to Examine the Early Obesogenic Home Environment: Comparison of SenseCam Images to the Home Environment Interview. Journal of Medical Internet Research 19(10): e332. October 2017.
  3. Jonathan Gershuny, Teresa Harms, Aiden Doherty, Emma Thomas, Karen Milton, Paul Kelly and Charlie Foster. CAPTURE24: Testing self-report time-use diaries against objective instruments in real time. Centre for Time Use Research, Department of Sociology, University of Oxford. October 2017.
  4. L.N. Signal, J. Stanley, M. Smith, M.B. Barr, T.J. Chambers, J. Zhou, A. Duane, C. Gurrin, A.F. Smeaton, C. McKerchar, A.L. Pearson, J. Hoek, G.L.S. Jenkin and C. Ni Mhurchu. Children’s everyday exposure to food marketing: an objective analysis using wearable cameras. International Journal of Behavioral Nutrition and Physical Activity, volume 14 issue 1. October 2017.
  5. Louise N. Signal, Moira B. Smith, Michelle Barr, James Stanley, Tim J. Chambers, Jiang Zhou, Aaron Duane, Gabrielle L.S. Jenkin, Amber L. Pearson, Cathal Gurrin, Alan F Smeaton, Janet Hoek and Cliona Ni Mhurchu. Kids’ Cam: An objective methodology to study the world in which children live. American Journal of Preventive Medicine, Volume 53, Issue 3, pp. e89-e95. September 2017.
  6. Dang Nguyen, Duc Tien, Luca Piras, Michael Riegler, Giulia Boato, Liting Zhou and Cathal Gurrin. Overview of ImageCLEF lifelog 2017: lifelog retrieval and summarization. In: ImageCLEF 2017, 11-13 Sept 2017, Dublin. September 2017.
  7. Nora E. Miller, Whitney A. Welch, Aiden Doherty and Scott J. Strath. Accuracy Of Behavioral Assessment With A Wearable Camera in Semi-structured And Free Living Conditions In Older Adults: 2307 Board #320 June 1 2. Medicine & Science in Sports & Exercise. 49(5S):651. DOI: 10.1249/01.mss.0000518714.94905.35 May 2017.
  8. A.R. Silva, M.S. Pinho, L. Macedo, C.J.A. Moulin. The cognitive effects of wearable cameras in Mild Alzheimer disease-An experimental study. Current Alzheimer Research 14(12) pp. 1270-1282. May 2017.
  9. Tiffany E. Chow, Jesse Rissman. Neurocognitive mechanisms of real-world autobiographical memory retrieval: insights from studies using wearable camera technology. Annals of the New York Academy of Sciences, volume 1396, pp. 202-221. May 2017.
  10. Ana Rita Silva, Maria Salomé Pinho, Luís Macedo, Christopher Moulin, Salomé Caldeira and Horácio Firmino. It is not only memory: effects of sensecam on improving well-being in patients with mild alzheimer disease. International Psychogeriatrics, volume 29, issue 5, pp. 741-754, Cambridge University Press. May 2017.
  11. Gemma Wilson. Examining the differences between the use of wearable cameras and traditional cameras in research – a research note. International Journal of Social Research Methodology Vol. 20, Issue 5.  April 2017.
  12. Ali Mair, Marie Poirier and Martin A. Conway. Supporting older and younger adults’ memory for recent everyday events: A prospective sampling study using SenseCam. Consciousness and Cognition, volume 49, pages 190-202. March 2017.
  13. Mélissa Allé, Liliann Manning, Jevita Potheegadoo, Romain Coutelle, Jean-Marie Danion and Fabrice Berna. Wearable Cameras Are Useful Tools to Investigate and Remediate Autobiographical Memory Impairment: A Systematic PRISMA Review. Neuropsychology Review, volume 27, issue 1, pp. 81-99. March 2017.
  14. Lydia Dubourg, Ana Rita Silva, Christophe Fitamen, Chris J.A. Moulin, Céline Souchay. SenseCam: A new tool for memory rehabilitation? Revue Neurologique, volume 172, issue 12,  pp. 735-747. December 2016.
  15. Camille Nebeker, Tiffany Lagare, Michelle Takemoto, Brittany Lewars, Katie Crist, Cinnamon S. Bloss and Jacqueline Kerr. Engaging research participants to inform the ethical conduct of mobile imaging, pervasive sensing, and location tracking research. Translational Behavioral Medicine, volume 6, issue 4, pp 577–586. December 2016.
  16. Amar Dhand, Alexandra E. Dalton, Douglas A. Luke, Brian F. Gage and Jin-Moo Lee. Accuracy of Wearable Cameras to Track Social Interactions in Stroke Survivors. Journal of Stroke and Cerebrovascular Diseases, volume 25, issue 12, pp. 2907-2910. December 2016.
  17. Wilson, G, Jones, D, Schofield, P & Martin, DJ. Experiences of using a wearable camera to record activity, participation and health-related behaviours: Qualitative reflections of using the Sensecam. Digital Health volume 2: 1-11. November 2016.
  18. E. Thomas, T. Harms, K. Milton, P. Kelly, A. Doherty, J. Gershuny and C. Foster. Reconstructing Time Use to Understand Human Behavior: Combining Accelerometry, Wearable Cameras, Diaries and Interviews. International Journal of Behavioral Medicine, volume 23, pp 236-237, Springer. November 2016.
  19. Ruth Bartlett, Andrew Balmer, Petula Brannelly. Digital technologies as truth-bearers in health care. Nursing Philosophy, volume 18, issue 1, Wiley. November 2016.
  20. Alan F. Smeaton, Kevin McGuinness, Cathal Gurrin, Jiang Zhou, Noel E. O’Connor, Peng Wang, Brian Davis, Lucas Azevedo, Andre Freitas, Louise Signal, Moira Smith, James Stanley, Michelle Barr, Tim Chambers, and Cliona Ní Mhurchu. Semantic Indexing of Wearable Camera Images: Kids’Cam Concepts. In Proceedings of the 2016 ACM workshop on Vision and Language Integration Meets Multimedia Fusion (iV&L-MM ’16). October 2016.
  21. Na Li, Cathal Gurrin, Martin Crane and Heather J. Ruskin. NTCIR-12 Lifelog Data Analytics. In Proceedings of the first Workshop on Lifelogging Tools and Applications (LTA ’16), pp. 27-36. October 2016.
  22. Soumyadeb Chowdhury, Md Sadek Ferdous, and Joemon M Jose. 2016. Exploring lifelog sharing and privacy. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct (UbiComp ’16), 553-558. September 2016.
  23. M. Kamar, C. Evans, S. Hugh-Jones. OP58 Factors influencing adolescent whole grain intake: In-depth interviews with adolescents using SenseCam technology.
  24. Peng Wang. Investigating factorizations in everyday activity recognition. In Proceedings of SPIE, volume 10033, Eighth International Conference on Digital Image Processing (ICDIP 2016). August 2016.
  25. Peng Wang, Lifeng Sun, Shiqiang Yang, Alan F. Smeaton, and Cathal Gurrin. 2016. Characterizing everyday activities from visual lifelogs based on enhancing concept representation. Computer Vision and Image Understanding 148, C, pp. 181-192. July 2016.
  26. Stephanie Gauthier. Investigating the probability of behavioural responses to cold thermal discomfort. Energy and Buildings, Volume 124, pp. 70-78, Elsevier. July 2016.
  27. Cathal Gurrin, Hideo Joho, Frank Hopfgartner, Liting Zhou and Rami Albatal. 2016. NTCIR Lifelog: The First Test Collection for Lifelog Research. In Proceedings of the 39th International ACM SIGIR conference on Research and Development in Information Retrieval (SIGIR ’16), pp. 705-708. July 2016.
  28. Gustavo Cuberos-Urbano, Alfonso Caracuel, Carlos Valls-Serrano, Leticia García-Mochón, Fergus Gracey & Antonio Verdejo-García. A pilot investigation of the potential for incorporating lifelog technology into executive function rehabilitation for enhanced transfer of self-regulation skills to everyday life. Neuropsychological Rehabilitation . June 2016.
  29. Paulina Piasek, Kate Irving, Alan F Smeaton, Exploring boundaries to the benefits of lifelogging for identity maintenance for people with dementia, In Psychology and Mental Health: Concepts, Methodologies, Tools, and Applications: Concepts, Methodologies, Tools, and Applications, Information Resources Management Association. May 2016.
  30. Kristin Meseck, Marta M. Jankowska, Jasper Schipperijn, Loki Natarajan, Suneeta Godbole, Jordan Carlson, Michelle Takemoto, Katie Crist and Jacqueline Kerr. Is missing geographic positioning system data in accelerometry studies a problem, and is imputation the solution? Geospatial health, 11(2):403. May 2016.
  31. Katherine Ellis, Jacqueline Kerr, Suneeta Godbole, John Staudenmayer and Gert Lanckriet. Hip and Wrist Accelerometer Algorithms for Free-Living Behavior Classification. Medicine and science in sports and exercise 48(5) pp. 933–940. May 2016.
  32. Jacqueline Kerr, Ruth E. Patterson, Katherine Ellis, Suneeta Godbole, Eileen Johnson, Gert Lanckriet and John Staudenmayer. Objective Assessment of Physical Activity: Classifiers for Public Health. Medicine and science in sports and exercise, 48(5):951-957. May 2016.
  33. Abby C. King, Sandra J. Winter, Jylana L. Sheats, Lisa G. Rosas, Matthew P. Buman, Deborah Salvo, Nicole M. Salvo, Rebecca A. Seguin, Mika Moran, Randi Moran, Bonnie Broderick, Susan G. Broderick, Olga Lucia Broderick, Silvia A. Broderick, Ann Broderick and Juan Rivera Dommarco. Leveraging Citizen Science and Information Technology for Population Physical Activity Promotion. Translational Journal of the American College of Sports Medicine: Volume 1, Issue 4, pp. 30–44. May 2016.
  34. L. Gemming, C. Ni Mhurchu. Dietary under-reporting: what foods and which meals are typically under-reported? European Journal of Clinical Nutrition, 70(5):640-1. May 2016.
  35. Louise Signal, Moira Smith, Michelle Barr, Tim Chambers, James Stanley, Jiang Zhou, Aaron Duane, Gabrielle Jenkin, Tolotea Lanumata, Amber Pearson, Cathal Gurrin, Alan Smeaton, Janet Hoek and Cliona Ni Mhurchu. Saturated with Junk: The extent of Junk Food Marketing in Children’s Everyday Lives. Obesity Reviews (Special Issue: Abstracts of the 13th International Congress on Obesity), volume 17, supplement S2, pp. 152-198. April 2016.
  36. Belinda Lowe, Moira Smith, Richard Jaine, Michelle Barr, Tim Chambers, James Stanley, Jiang Zhou, Aaron Duane, Cathal Gurrin, Alan Smeaton, Cliona Ni Mhurchu and Louise Signal. Watching the watchers: quantifying the duration and nature of children’s after-school screen time. Obesity Reviews (Special Issue: Abstracts of the 13th International Congress on Obesity), volume 17, supplement S2, pp. 152-198. April 2016.
  37. Morgan Harvey, Marc Langheinrich, Geoff Ward. Remembering through lifelogging: A survey of human memory augmentation. Pervasive and Mobile Computing, volume 27, pp. 14-26. April 2016.
  38. V. Kuchelmeister and J. Bennett. The Amnesia Atlas VR. A photographic media interface as memory-prosthesis. 2016 IEEE Virtual Reality (VR), pp. 330-330. March 2016.
  39. Juliet A. Harvey, Dawn A. Skelton and Sebastien F. M. Chastin. Acceptability of novel lifelogging technology to determine context of sedentary behaviour in older adults. AIMS Public Health, 3 (1), pp. 158-171. March 2016.
  40. Na Li, Martin Crane, Cathal Gurrin, and Heather J. Ruskin. Finding Motifs in Large Personal Lifelogs. In Proceedings of the 7th Augmented Human International Conference 2016 (AH ’16). ACM, New York, NY, USA, Article 9, 8 pages. February 2016.
  41. A. R. Silva, M. S. Pinho, L. Macedo & C. J. A. Moulin. A critical review of the effects of wearable cameras on memory. Neuropsychological Rehabilitation Volume 28, Issue 1. January 2016.
  42. Gill Cowburn, Anne Matthews, Aiden Doherty, Alex Hamilton, Paul Kelly, Julianne Williams, Charlie Foster and Michael Nelson. Exploring the opportunities for food and drink purchasing and consumption by teenagers during their journeys between home and school: a feasibility study using a novel method. Public health nutrition, volume 19, issue 1, pp. 93-103, Cambridge University Press. January 2016.
  43. Paul Kelly, Emma Thomas, Aiden Doherty, Teresa Harms, Órlaith Burke, Jonathan Gershuny, Charlie Foster. Developing a Method to Test the Validity of 24 Hour Time Use Diaries Using Wearable Cameras: A Feasibility Pilot. PLoS ONE 10(12): e0142198. December 2015.
  44. Ana Rita Silva, Maria Salome Pinho, Luis Macedo, Horacio Firmino and Christopher Moulin. Using SenseCam to stimulate cognitive function and decrease depressive symptoms in mild Alzheimer disease. International Psychogeriatrics, volume 27, Cambridge University Press. December 2015.
  45. Luke Gemming, Aiden Doherty, Jennifer Utter, Emma Shields, Cliona Ni Mhurchu. The use of a wearable camera to capture and categorise the environmental and social context of self-identified eating episodes. Appetite, Volume 92, 1 September 2015, Pages 118-125. September 2015.
  46. Paulina Piasek. Case studies in therapeutic SenseCam use aimed at identity maintenance in early stage dementia. PhD Thesis, Dublin City University. September 2015.
  47. Alexandra L. Young, Anya Skatova, Benjamin Bedwell, Tom Rodden, and Victoria Shipp. 2015. The Role of Accidental Self-Reflection in Wearable Camera Research. In Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct (MobileHCI ’15). ACM, New York, NY, USA, 1062-1065. August 2015.
  48. Youngdeok Kim, Vaughn W. Barry & Minsoo Kang. Validation of the ActiGraph GT3X and activPAL Accelerometers for the Assessment of Sedentary Behavior. Measurement in Physical Education and Exercise Science Volume 19, Issue 3. August 2015.
  49. Christopher Moulin, Ana Rita Silva, Alexandra Ernst, Lydia Dubourg, Charline Cerf and Céline Souchay. Memory Improvement in clinical groups using Life-Logging Technologies. Society for Applied Research in Memory and Cognition (SARMAC) XI conference. July 2015.
  50. Soumyadeb Chowdhury, Philip J. McParlane, Md. Sadek Ferdous, and Joemon Jose. 2015. “My Day in Review”: Visually Summarising Noisy Lifelog Data. In Proceedings of the 5th ACM on International Conference on Multimedia Retrieval (ICMR ’15). ACM, New York, NY, USA, 607-610. June 2015.
  51. Siobhan O. Connor, Noel McCaffrey, Enda Whyte & Kieran Moran. The novel use of a SenseCam and accelerometer to validate training load and training information in a self-recall training diary. Journal of Sports Sciences Volume 34, Issue 4. June 2015.
  52. Camille Nebeker, Rubi Linares-Orozco and Katie Crist. A Multi-Case Study Of Research Using Mobile Imaging, Sensing And Tracking Technologies To Objectively Measure Behavior: Ethical Issues And Insights To Guide Responsible Research Practice. Journal of Research Administration, volume 46, issue 1, pp. 118-137. May 2015.
  53. Gill Cowburn, Anne Matthews, Aiden Doherty, Alex Hamilton, Paul Kelly, Julianne Williams, Charlie Foster and Michael Nelson. Exploring the opportunities for food and drink purchasing and consumption by teenagers during their journeys between home and school: a feasibility study using a novel method. Public Health Nutrition: 19(1), 93–103. April 2015.
  54. The use of a wearable camera improves autobiographical memory in patients with Alzheimer’s disease. Memory, Vol. 23, Issue 3. April 2015.
  55. Katja C. Thoring, Roland M. Mueller, and Petra Badke-Schaub. Ethnographic Design Research With Wearable Cameras. Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA ’15). April 2015.
  57. Louise Hopper, Paulina Piasek and Kate Irving. Ethical challenges associated with technology use with people with dementia. In: 8th International Association of Gerontology and Geriatrics European Region (IAGG-ER) Congress, Dublin, Ireland. April 2015.
  58. Luke Gemming. Image-assisted dietary assessment: Evaluating the potential of wearable cameras to enhance self-report in the 24-hour dietary recall method. PhD Thesis, University of Auckland. 2015.
  59. Stephanie Gauthier and David Shipworth. Behavioural responses to cold thermal discomfort. Building Research & Information Volume 43, Issue 3. February 2015.
  60. Luke Gemming, Elaine Rush, Ralph Maddison, Aiden Doherty, Nicholas Gant, Jennifer Utter and Cliona Ni Mhurchu. Wearable cameras can reduce dietary under-reporting: doubly labelled water validation of a camera-assisted 24 h recall. British Journal of Nutrition (2015), 113, 284–291. January 2015.
  61. Luke Gemming, Jennifer Utter and Cliona Ni Mhurchu. Image-Assisted Dietary Assessment: A Systematic Review of the Evidence. Journal of the Academy of Nutrition and Dietetics. January 2015.
  62. Stefan Terziyski, Rami Albatal and Cathal Gurrin. Fast human activity recognition in lifelogging.  In: He X., Luo S., Tao D., Xu C., Yang J., Hasan M.A. (eds) MultiMedia Modeling. MMM 2015. Lecture Notes in Computer Science, vol 8936. Springer. January 2015.
  63. Gunjan Kumar, Houssem Jerbi, Cathal Gurrin and Michael P. O’Mahony. 2014. Towards Activity Recommendation from Lifelogs. In Proceedings of the 16th International Conference on Information Integration and Web-based Applications & Services (iiWAS ’14), Maria Indrawan-Santiago, Matthias Steinbauer, Hong-Quang Nguyen, A. Min Tjoa, Ismail Khalil, and Gabriele Anderst-Kotsis (Eds.), pp. 87-96. December 2014.
  64. Cathal Gurrin, Håvard Johansen, Thomas Sødring and Dag Johansen. Digital chronofiles of life experience. In: Aiello L., McFarland D. (eds) Social Informatics. SocInfo 2014. Lecture Notes in Computer Science, vol 8852, Springer. November 2014.
  65. Paulina Piasek, Alan Smeaton and Kate Irving. Recommendations for Incorporating Lifelogging Technologies into Therapeutic Approaches for People with Dementia. Irish Journal of Medical Science, Volume 183 Supplement 7. October 2014.
  66. P. Kelly, A. Doherty, A. Mizdrak, S. Marshall, J. Kerr, A. Legge, S. Godbole, H. Badland, M. Oliver and C. Foster. High group level validity but high random error of a self-report travel diary, as assessed by wearable cameras. Journal of Transport & Health, Volume 1, Issue 3, September 2014, Pages 190-201. September 2014.
  67. Robert Templeman, Roberto Hoyle, David Crandall and Apu Kapadia. 2014. Reactive security: responding to visual stimuli from wearable cameras. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication (UbiComp ’14 Adjunct). ACM, New York, NY, USA, 1297-1306. September 2014.
  68. Paulina Piasek, Kate Irving and Alan F. Smeaton. Using lifelogging to help construct the identity of people with dementia. In: Irish Human Computer Interaction Conference 2014, DCU, Dublin, Ireland. September 2014.
  69. Cathal Gurrin, Alan F. Smeaton and Aiden R. Doherty. “LifeLogging: Personal Big Data”. Foundations and Trends in Information Retrieval: Vol. 8: No. 1, pp 1-125. http://dx.doi.org/10.1561/1500000033. June 2014.
  70. Luke Gemming, Aiden Doherty, Nick Gant, Jennifer Utter, Ralph Maddison and Cliona Ni Mhurchu. Validation Of A Wearable Camera Image-assisted 24 H Dietary Recall Against Doubly Labelled Water. Nutrition & Dietetics, volume 71, p. 25. May 2014.
  71. Catalina Spataru and Stephanie Gauthier. How to monitor people ‘smartly’ to help reducing energy consumption in buildings? Architectural Engineering and Design Management Volume 10, Issue 1-2. November 2013.
  72. Katherine Ellis, Suneeta Godbole, Jacqueline Chen, Simon Marshall, Gert Lanckriet and Jacqueline Kerr. 2013. Physical activity recognition in free-living from body-worn sensors. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 88-89. November 2013.
  73. Michael S. Lam, Suneeta Godbole, Jacqueline Chen, Melody Oliver, Hannah Badland, Simon J. Marshall, Paul Kelly, Charlie Foster, Aiden Doherty and Jacqueline Kerr. 2013. Measuring time spent outdoors using a wearable camera and GPS. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 1-7. November 2013.
  74. Aiden Doherty, Wilby Williamson, Melvyn Hillsdon, Steve Hodges, Charlie Foster and Paul Kelly. 2013. Influencing health-related behaviour with wearable cameras: strategies & ethical considerations. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 60-67. November 2013.
  75. Jylana L. Sheats, Sandra J. Winter, Priscilla Padilla-Romero, Lisa Goldman-Rosas, Lauren A. Grieco and Abby C. King. 2013. Comparison of passive versus active photo capture of built environment features by technology naïve Latinos using the SenseCam and Stanford healthy neighborhood discovery tool. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 8-15. November 2013.
  76. Cathal Gurrin, Alan F. Smeaton, Zhengwei Qiu and Aiden Doherty. 2013. Exploring the technical challenges of large-scale lifelogging. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 68-75. November 2013.
  77. Basel Kikhia and Josef Hallberg. 2013. Visualizing and managing stress through colors and images. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 78-79. November 2013.
  78. Lijuan Marissa Zhou and Cathal Gurrin. 2013. MemoryMesh: lifelogs as densely linked hypermedia. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 90-91. November 2013.
  79. Niamh Caprani, Noel E. O’Connor and Cathal Gurrin. 2013. Experiencing SenseCam: a case study interview exploring seven years living with a wearable camera. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 52-59. November 2013.
  80. Gemma Wilson, Derek Jones, Patricia Schofield and Denis Martin. 2013. The use of the Sensecam to explore daily functioning of older adults with chronic pain. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 76-77. November 2013.
  81. Jacqueline Chen, Simon J. Marshall, Lu Wang, Suneeta Godbole, Amanda Legge, Aiden Doherty, Paul Kelly, Melody Oliver, Ruth Patterson, Charlie Foster and Jacqueline Kerr. 2013. Using the SenseCam as an objective tool for evaluating eating patterns. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 34-41. November 2013.
  82. Michelle Barr, Louise Signal, Gabrielle Jenkin and Moira Smith. 2013. Using SenseCam to capture children’s exposure to food marketing: a feasibility study. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 50-51. November 2013.
  83. Catherine Marinac, Gina Merchant, Suneeta Godbole, Jacqueline Chen, Jacqueline Kerr, Bronwyn Clark and Simon Marshall. 2013. The feasibility of using SenseCams to measure the type and context of daily sedentary behaviors. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 42-49. November 2013.
  84. Yang Yang and Cathal Gurrin. 2013. Personal lifelog visualization. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 82-83. November 2013.
  85. Sabrina Agnihotri, Joanne Rovet, Deb Cameron, Carmen Rasmussen, Jennifer Ryan and Michelle Keightley. 2013. SenseCam as an everyday memory rehabilitation tool for youth with fetal alcohol spectrum disorder. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 86-87. November 2013.
  86. Suzanne Mavoa, Melody Oliver, Jacqueline Kerr, Aiden Doherty and Karen Witten. 2013. Using SenseCam images to assess the environment. In Proceedings of the 4th International SenseCam & Pervasive Imaging Conference (SenseCam ’13). ACM, New York, NY, USA, 84-85. November 2013.
  87. John G. Seamon, Tacie N. Moskowitz, Ashley E. Swan, Boyuan Zhong, Amy Golembeski, Christopher Liong, Alexa C. Narzikul and Olumide A. Sosan. SenseCam reminiscence and action recall in memory-unimpaired people. Memory, Volume 22, Issue 7. October 2013.
  88. Jenny Svanberg and Jonathan J. Evans. Impact of SenseCam on memory, identity and mood in Korsakoff’s syndrome: A single case experimental design study. Neuropsychological Rehabilitation Volume 24, Issue 3-4. October 2013.
  89. L. Gemming, A. Doherty, P. Kelly, J. Utter and C. Ni Mhurchu. Feasibility of a SenseCam-assisted 24-h recall to reduce under-reporting of energy intake. European Journal of Clinical Nutrition, 67, 1095–1099 (2013). September 2013.
  90. Niamh Caprani, Noel E O’Connor and Cathal Gurrin. Investigating older and younger peoples’ motivations for lifelogging with wearable cameras. 2013 IEEE International Symposium on Technology and Society (ISTAS): Social Implications of Wearable Computing and Augmediated Reality in Everyday Life, Toronto, ON, pp. 32-41. June 2013.
  91. Corina Sas, Tomasz Fratczak, Matthew Rees, Hans Gellersen, Vaiva Kalnikaite, Alina Coman and Kristina Höök. 2013. AffectCam: arousal-augmented SenseCam for richer recall of episodic memories. In CHI ’13 Extended Abstracts on Human Factors in Computing Systems (CHI EA ’13). ACM, New York, NY, USA, 1041-1046. May 2013.
  92. Melody Oliver, Aiden R. Doherty, Paul Kelly, Hannah M. Badland, Suzanne Mavoa, Janine Shepherd, Jacqueline Kerr, Simon Marshall, Alexander Hamilton and Charlie Foster. Utility of passive photography to objectively audit built environment features of active transport journeys: an observational study. International Journal of Health Geographics, 2013, 12:20. April 2013.
  93. Aiden R. Doherty, Steve E. Hodges, Abby C. King, Alan F. Smeaton, Emma Berry, Chris J.A. Moulin, Siân Lindley, Paul Kelly and Charlie Foster. Wearable Cameras in Health. American Journal of Preventive Medicine, Volume 44, Issue 3, Pages 320-323. March 2013.
  94. Paul Kelly, Simon J. Marshall, Hannah Badland, Jacqueline Kerr, Melody Oliver, Aiden R. Doherty and Charlie Foster. An Ethical Framework for Automated, Wearable Cameras in Health Behavior Research. American Journal of Preventive Medicine, Volume 44, Issue 3, Pages 314-319. March 2013.
  95. Cathal Gurrin, Zhengwei Qiu, Mark Hughes, Niamh Caprani, Aiden R. Doherty, Steve E. Hodges and Alan F. Smeaton. The Smartphone As a Platform for Wearable Cameras in Health Research. American Journal of Preventive Medicine, Volume 44, Issue 3, Pages 308-313. March 2013.
  96. Ana R. Silva, Salomé Pinho, Luís M. Macedo and Chris J. Moulin. Benefits of SenseCam Review on Neuropsychological Test Performance. American Journal of Preventive Medicine, Volume 44, Issue 3, Pages 302–307. March 2013.
  97. Gillian O’Loughlin, Sarah Jane Cullen, Adrian McGoldrick, Siobhan O’Connor, Richard Blain, Shane O’Malley and Giles D. Warrington. Using a Wearable Camera to Increase the Accuracy of Dietary Analysis. American Journal of Preventive Medicine, Volume 44, Issue 3, Pages 297–301. March 2013.
  98. Jacqueline Kerr, Simon J. Marshall, Suneeta Godbole, Jacqueline Chen, Amanda Legge, Aiden R. Doherty, Paul Kelly, Melody Oliver, Hannah M. Badland, Charlie Foster. Using the SenseCam to Improve Classifications of Sedentary Behavior in Free-Living Settings. American Journal of Preventive Medicine, Volume 44, Issue 3, Pages 290–296. March 2013.
  99. Jacqueline Kerr, Katherine Ellis, Gert Lanckriet, John Staudenmayer, Suneeta Godbole, Jacqueline Chen, Paul Kelly, Aiden R. Doherty, Charlie Foster, Melody Oliver, Hannah Badland and Simon Marshall. “Will the real breaks please stand up!” Detecting breaks from sitting using objective tools and novel analytic techniques. Society of Behavioral Medicine. March 2013.
  100. Aiden R Doherty, Paul Kelly, Jacqueline Kerr, Simon Marshall, Melody Oliver, Hannah Badland, Alexander Hamilton and Charlie Foster. Using wearable cameras to categorise type and context of accelerometer-identified episodes of physical activity. International Journal of Behavioral Nutrition and Physical Activity 10:22. https://doi.org/10.1186/1479-5868-10-22 February 2013.
  101. Aiden Doherty, Paul Kelly and Charlie Foster. Wearable cameras: identifying healthy transportation choices. IEEE Pervasive Computing, volume 12 issue 1 pp. 44-47. January 2013.
  102. Paul Kelly. Assessing the utility of wearable cameras in the measurement of walking and cycling. DPhil Thesis, University of Oxford. 2013.
  103. A. Doherty, S. Marshall, P. Kelly, A. Hamilton, M. Oliver, H. Badland, J. Kerr and C. Foster. Identifying sedentary behaviour types using SenseCam: A pilot study. Journal of Science and Medicine in Sport, volume 15, pp. 296-297. December 2012.
  104. A. Doherty, P. Kelly, M. Oliver, A. Hamilton, H. Badland, S. Marshall, J. Kerr and C. Foster. Using SenseCam to categorise type and context of accelerometer-identified episodes. Journal of Science and Medicine in Sport, volume 15 pp. 92-93. December 2012.
  105. Paul Kelly, Aiden R Doherty, Alex Hamilton, Anne Matthews, Alan M Batterham, Michael Nelson, Charlie Foster and Gill Cowburn. Evaluating the feasibility of measuring travel to school using a wearable camera. American Journal of Preventive Medicine, volume 43 issue 5, pp. 546-550. November 2012.
  106. Aiden R Doherty, Paul Kelly, Jacqueline Kerr, Simon Marshall, Melody Oliver, Hannah Badland and Charlie Foster. Use of wearable cameras to assess population physical activity behaviours: an observational study. The Lancet, volume 380. November 2012.
  107. Gillian R. Hayes and Khai N. Truong. Paratyping: A Contextualized Method of Inquiry for Understanding Perceptions of Mobile and Ubiquitous Computing Technologies. Human–Computer Interaction Volume 28, Issue 3. June 2012.
  108. Aiden R. Doherty, Katalin Pauly-Takacs, Niamh Caprani, Cathal Gurrin, Chris J. A. Moulin, Noel E. O’Connor and Alan F. Smeaton. Experiences of Aiding Autobiographical Memory Using the SenseCam. Human–Computer Interaction Volume 27, Issue 1-2. April 2012.
  109. Masashi Crete-Nishihata, Ronald M. Baecker, Michael Massimi, Deborah Ptak, Rachelle Campigotto, Liam D. Kaufman, Adam M. Brickman, Gary R. Turner, Joshua R. Steinerman and Sandra E. Black. Reconstructing the Past: Personal Memory Technologies Are Not Just Personal and Not Just for Memory. Human–Computer Interaction Volume 27, Issue 1-2. April 2012.
  110. Steve Whittaker, Vaiva Kalnikaitė, Daniela Petrelli, Abigail Sellen, Nicolas Villar, Ofer Bergman, Paul Clough and Jens Brockmeier. Socio-Technical Lifelogging: Deriving Design Principles for a Future Proof Digital Past. Human–Computer Interaction Volume 27, Issue 1-2. April 2012.
  111. Paulina Piasek, Kate Irving and Alan F. Smeaton. Case study in SenseCam use as an intervention technology for early-stage dementia. International Journal of Computers in Healthcare, Volume 1 Number 4, pp. 304-319. January 2012.
  112. Daragh Byrne. Digital Life Stories: Semi-Automatic (Auto)Biographies within Lifelog Collections. PhD thesis, Dublin City University. January 2012.
  113. SenseCam: A wearable camera that stimulates and rehabilitates autobiographical memory. Memory, Vol. 19, Issue 7. October 2011.
  114. SenseCam improves memory for recent events and quality of life in a patient with memory retrieval difficulties. Memory, Vol. 19, Issue 7. October 2011.
  115. C. Loveday and M.A. Conway. Using SenseCam with an Amnesic Patient: Accessing Inaccessible Everyday Memories. Memory, Vol. 19, Issue 7. October 2011.
  116. Fionnuala C. Murphy, Philip J. Barnard, Kayleigh A. M. Terry, Maria Teresa Carthery-Goulart and Emily A. Holmes. SenseCam, imagery and bias in memory for wellbeing. Memory, Vol. 19, Issue 7. October 2011.
  117. F. Milton, N. Muhlert, C. R. Butler, A. Smith, A. Benattayallah and A. Z. Zeman. An fMRI study of long-term everyday memory using SenseCam. Memory, Vol. 19, Issue 7. October 2011.
  118. Kiernan Burke, Sue Franklin and Olive Gowan. Passive imaging technology in aphasia therapy. Memory, Vol. 19, Issue 7. October 2011.
  119. Jason R. Finley, William F. Brewer and Aaron S. Benjamin. The effects of end-of-day picture review and a sensor-based picture capture procedure on autobiographical memory using SenseCam. Memory, Vol. 19, Issue 7. October 2011.
  120. Philip J. Barnard, Fionnuala C. Murphy, Maria Teresa Carthery-Goulart, Cristina Ramponi and Linda Clare. Exploring the basis and boundary conditions of SenseCam-facilitated recollection. Memory, Vol. 19, Issue 7. October 2011.
  121. Peggy L. St. Jacques, Martin A. Conway and Roberto Cabeza. Gender differences in autobiographical memory for everyday events: Retrieval elicited by SenseCam images versus verbal cues. Memory, Vol. 19, Issue 7. October 2011.
  122. Aiden R. Doherty, Chris J. A. Moulin and Alan F. Smeaton. Automatically assisting human memory: A SenseCam browser. Memory, Vol. 19, Issue 7. October 2011.
  123. Katalin Pauly-Takacs, Chris J. A. Moulin and Edward J. Estlin. SenseCam as a rehabilitation tool in a child with anterograde amnesia. Memory, Vol. 19, Issue 7. October 2011.
  124. Rob Brindley, Andrew Bateman and Fergus Gracey. Exploration of use of SenseCam to support autobiographical memory retrieval within a cognitive-behavioural therapeutic intervention following acquired brain injury. Memory, Vol. 19, Issue 7. October 2011.
  125. Liadh Kelly. Context Driven Retrieval Algorithms for Semi-Structured Personal Lifelogs. PhD Thesis, Dublin City University. September 2011.
  126. Doherty, Aiden R. and Caprani, Niamh and O Conaire, Ciaran and Kalnikaite, Vaiva and Gurrin, Cathal and O’Connor, Noel and Smeaton, Alan F. Passively recognising human activities through lifelogging. Computers in Human Behavior, 27 (5). pp. 1948-1958. ISSN 0747-5632. September 2011.
  127. Qiu, Zhengwei and Doherty, Aiden R. and Gurrin, Cathal and Smeaton, Alan F. Mining user activity as a context source for search and retrieval. In: STAIR’11: International Conference on Semantic Technology and Information Retrieval, 28-29 June 2011, Kuala Lumpur, Malaysia.
  128. P.L. St Jacques, M.A. Conway, M.W. Lowder and R. Cabeza. Watching my mind unfold versus yours: an fMRI study using a novel camera technology to examine neural differences in self-projection of self versus other perspectives. Journal of Cognitive Neuroscience 2011; 23(6):1275-84. June 2011.
  129. P. Piasek, K. Irving and A. F. Smeaton. SenseCam intervention based on Cognitive Stimulation Therapy framework for early-stage dementia. 5th International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth) and Workshops, Dublin, pp. 522-525. May 2011.
  130. Paul Kelly, Aiden Doherty, Emma Berry, Steve Hodges, Alan M Batterham and Charlie Foster. Can we use digital life-log images to investigate active and sedentary travel behaviour? Results from a pilot study. International Journal of Behavioral Nutrition and Physical Activity. May 2011.
  131. Siân Lindley, Maxine Glancy, Richard Harper, Dave Randall and Nicola Smyth. “Oh and how things just don’t change, the more things stay the same”: Reflections on SenseCam images 18 months after capture. International Journal of Human-Computer Studies, Volume 69, Issue 5, Pages 311-323. May 2011.
  132. Byrne, Daragh and Kelliher, Aisling and Jones, Gareth J.F. Life editing: Third-party perspectives on lifelog content. In: The ACM CHI Conference on Human Factors in Computing Systems (CHI 2011), May 2011, Vancouver, Canada.
  133. Aiden R. Doherty, Kieron O’Hara, Kiyoharu Aizawa, Niamh Caprani and Alan F. Smeaton (Eds.), Proceedings of the 2nd Workshop on Information Access to Personal Media Archives. 18 April 2011, Dublin, Ireland, ISBN 1872-327-974. April 2011.
  134. Niamh Caprani, Noel E. O’Connor, and Cathal Gurrin. 2011. Considerations for a touchscreen visual lifelog. In Proceedings of the 1st ACM International Conference on Multimedia Retrieval (ICMR ’11). ACM, New York, NY, USA, Article 67, 2 pages. April 2011.
  135. Chen, Yi and Jones, Gareth J.F. and Ganguly, Debasis Segmenting and summarizing general events in a long-term lifelog. In: The 2nd Workshop Information Access for Personal Media Archives (IAPMA) at ECIR 2011, 18-21 April 2011, Dublin, Ireland.
  136. Sutton, Jon. Claire’s life, 9:53-10:42. The Psychologist. February 2011.
  137. Nebojsa Jojic, Alessandro Perina, Vittorio Murino. Structural epitome: a way to summarize one’s visual experience. In Proceedings of the Twenty-fourth Annual Conference on Neural Information Processing Systems (NIPS). December 2010.
  138. Jones, Gareth J.F. and Byrne, Daragh and Hughes, Mark and O’Connor, Noel E. and Salway, Andrew. Automated annotation of landmark images using community contributed datasets and web resources. In: The 5th International Conference on Semantic and Digital Media Technologies (SAMT 2010), Saarbrücken, Germany. December 2010.
  139. Rowanne Fleck and Geraldine Fitzpatrick. Reflecting on reflection: framing a design landscape. In Proceedings of the 22nd Conference of the Computer-Human Interaction Special Interest Group of Australia on Computer-Human Interaction (OZCHI ’10). ACM, New York, NY, USA, 216-223. November 2010.
  140. St. Jacques PL, Conway MA, Lowder M, Cabeza R. Watching my mind unfold vs. yours: An fMRI study using a novel camera technology to examine the neural correlates of self projection of self vs. other. Journal of Cognitive Neuroscience, 23, 1275-1284. doi:10.1162/jocn.2010.21518. 2010.
  141. Eben Harrell. Remains of the Day. Time Magazine. October 2010.
  142. Philip Kelly, Aiden R. Doherty, Alan F. Smeaton, Cathal Gurrin, and Noel E. O’Connor. 2010. The colour of life: novel visualisations of population lifestyles. In Proceedings of the 18th ACM international conference on Multimedia (MM ’10). ACM, New York, NY, USA, 1063-1066. October 2010.
  143. Doherty, Aiden R. and Qiu, Zhengwei and Foley, Colum and Lee, Hyowon and Gurrin, Cathal and Smeaton, Alan F. Green multimedia: informing people of their carbon footprint through two simple sensors. In: ACM Multimedia 2010, 25-29 October 2010, Florence, Italy. ISBN 978-1-60558-933-6
  144. Caprani N, Gurrin, C and O’Connor N.E. I Like to Log: A Questionnaire Study towards Accessible Lifelogging for Older Users. In ASSETS 2010: The 12th International ACM SIGACCESS Conference on Computers and Accessibility, Orlando, Fl, 25-27 October, 2010.
  145. Emma Berry, Daragh Byrne, Aiden R. Doherty, Cathal Gurrin, Alan F. Smeaton. Proceedings of the second annual SenseCam symposium (SenseCam 2010). Dublin City University, Dublin, Ireland, ISBN 1872-327-915. September 2010.
  146. Chen, Yi and Kelly, Liadh and Jones, Gareth J.F. Supporting episodic memory from personal lifelog archives using SenseCam and contextual cues. In Proceedings of the second annual SenseCam symposium (SenseCam 2010), Dublin, Ireland. September 2010.
  147. Kelly, Philip and Kumar, Anil and Doherty, Aiden R. and Lee, Hyowon and Smeaton, Alan F. and Gurrin, Cathal and O’Connor, Noel E. The colour of life: interacting with SenseCam images on large multi-touch display walls.  In Proceedings of the second annual SenseCam symposium (SenseCam 2010), Dublin, Ireland. September 2010.
  148. Kelly L and Jones G.J.F. An Exploration of the Utility of GSR in Locating Events from Personal Lifelogs for Reflection. In Proceedings of the 4th Irish Human Computer Interaction Conference (iHCI2010), Dublin, Ireland, 2-3 September 2010.
  149. Chen Y and Jones G.J.F. Augmenting Human Memory using Personal Lifelogs. In Proceedings of the First Augmented Human International Conference (AH’10), Megève, France, April 2010.
  150. Byrne D, Kelly L and Jones G.J.F. Multiple Multimodal Mobile Devices: Lessons Learned from Engineering Lifelog Solutions. In: Handbook of Research on Mobile Software Engineering: Design, Implementation and Emergent Applications, IGI Publishing, 2010.
  151. Kelly, P and Foster, C. A new technology for measuring our journeys: Results from a pilot study. Early Career Investigator Prize. 2010 Annual Conference of the International Society of Behavioral Nutrition and Physical Activity, Minneapolis, 9-12 June, 2010.
  152. Marcu, G. and Hayes, G. R. Use of a Wearable Recording Device in Therapeutic Interventions for Children with Autism. In Proc Workshop on Interactive Systems in Healthcare (WISH). Atlanta, GA. Apr 10-11, 2010. pp 113-116. Dealer Analysis Group, ISBN 098262848X. April 2010.
  153. K. Pauly-Takacs, C.J.A. Moulin, E.J. Estlin and M.A. Conway. Supporting personal semantic memories with Sensecam: a child case study. Developmental Medicine & Child Neurology, volume 52, pp. 13-14. April 2010.
  154. Caprani, N., Doherty, A., Lee, H., Smeaton, A.F., O’Connor, N. and Gurrin, C. Designing a Touch-Screen SenseCam Browser to Support an Aging Population. CHI 2010 – ACM Conference on Human Factors in Computing Systems (Work-in-Progress), Atlanta, GA, 10-15 April 2010.
  155. Vaiva Kalnikaite, Abigail Sellen, Steve Whittaker, and David Kirk. Now let me see where I was: understanding how lifelogs mediate memory. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’10). ACM, New York, NY, USA, 2045-2054. April 2010.
  156. Siân Lindley, Eduardo H. Calvillo Gámez, Juan José Gámez Leija. Remembering Rituals of Remembrance:  Capturing Xantolo through SenseCam. CHI 2010 workshop on HCI at the End of Life. April 2010.
  157. Doherty, A.R. and Smeaton, A.F. Automatically augmenting lifelog events using pervasively generated content from millions of people. Sensors, 10 (3). pp. 1423-1446, 2010.
  158. Byrne, D. and Jones, G.J.F. Exploring narrative presentation for large multimodal lifelog collections through card sorting. ICIDS 2009 – Second International Conference on Interactive Digital Storytelling, Guimarães, Portugal, 9-11 December 2009.
  159. Fleck, R. & Fitzpatrick, G., Teachers’ and Tutors’ Social Reflection around SenseCam Images. International Journal of Human Computer Studies 67, pp. 1027-1036. December 2009.
  160. Kelly, L., Byrne, D. and Jones, G.J.F. The role of places and spaces in lifelog retrieval. PIM 2009 – Personal Information Management, Vancouver, Canada, 7-8 November, 2009.
  162. Doherty, Aiden R. and Gurrin, Cathal and Smeaton, Alan F. An investigation into event decay from large personal media archives. In: EIMM 2009 – 1st ACM International Workshop on Events in Multimedia, 23 October 2009, Beijing, China. October 2009.
  163. Aiden R. Doherty and Cathal Gurrin and Alan F. Smeaton. Utilising contextual memory retrieval cues and the ubiquity of the cell phone to review lifelogged physiological activities. In: EIMM 2009 – 1st ACM International Workshop on Events in Multimedia, pp. 49-46, Beijing, China. October 2009.
  164. David H. Nguyen, Gabriela Marcu, Gillian R. Hayes, Khai N. Truong, James Scott, Marc Langheinrich, and Christof Roduner. Encountering SenseCam: personal recording technologies in everyday life. In Proceedings of the 11th international conference on Ubiquitous computing (UbiComp ’09). ACM, New York, NY, USA, 165-174. September 2009.
  165. Siân E. Lindley, Richard Harper, Dave Randall, Maxine Glancy, and Nicola Smyth. Fixed in time and “time in motion”: mobility of vision through a SenseCam lens. In Proceedings of the 11th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI ’09). ACM, New York, NY, USA, Article 2, 10 pages. September 2009.
  166. Kelly L and Jones G.J.F. Examining the Utility of Affective Response in Search of Personal Lifelogs. 5th Workshop on Emotion in HCI, British HCI Conference 2009, Cambridge, U.K., 1 September 2009.
  167. Siân E. Lindley, Dave Randall, Wes Sharrock, Maxine Glancy, Nicola Smyth, and Richard Harper. Narrative, memory and practice: tensions and choices in the use of a digital artefact. In Proceedings of the 23rd British HCI Group Annual Conference on People and Computers: Celebrating People and Technology (BCS-HCI ’09). British Computer Society, Swindon, UK, UK, 1-9. September 2009.
  168. Chen Y and Jones G. An Event-Based Interface to Support Personal Lifelog Search. HCI International 2009 – 13th International Conference on Human-Computer Interaction, San Diego, CA, 19-24 July 2009.
  169. Daragh Byrne, Aiden R. Doherty, Cees G.M. Snoek, Gareth J.F. Jones, Alan F. Smeaton. Everyday Concept Detection in Visual Lifelogs: Validation, Relationships and Trends, Multimedia Tools and Applications, ISSN 1573-7721. July 2009.
  170. Fleck, R Supporting Reflection on Experience with SenseCam, Workshop on Designing for Reflection on Experience, CHI2009, Boston. April 2009.
  171. Daragh Byrne and Gareth J.F. Jones. Creating stories for reflection from multimodal lifelog content: an initial investigation.  In: Designing for Reflection on Experience Workshop at CHI 2009, Boston. April 2009.
  172. Sara Ljungblad. 2009. Passive photography from a creative perspective: “If I would just shoot the same thing for seven days, it’s like… What’s the point?”. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’09). ACM, New York, NY, USA, 829-838. April 2009.
  173. Laursen L. NEUROSCIENCE: A Memorable Device. Science 13 March 2009: 1422-1423.
  174. Emma L Berry, Adam Hampshire, James Rowe, Steve Hodges, Narinder Kapur, Peter Watson, Georgina Browne, Gavin Smyth, Ken Wood and Adrian M. Owen. The neural basis of effective memory therapy in a patient with limbic encephalitis. Journal of Neurology, Neurosurgery and Psychiatry, Volume 80, Issue 11. March 2009.
  175. O Conaire C, Blighe M and O’Connor N. SenseCam Image Localisation using Hierarchical SURF Trees. MMM 2009 – 15th international Multimedia Modeling Conference, Sophia-Antipolis, France, 7-9 January 2009.
  176. Aiden R. Doherty and Alan F. Smeaton. Utilising Wearable Sensor Technology to Provide Effective Memory Cues. European Research Consortium for Informatics and Mathematics ERCIM NEWS 76 January 2009.
  177. Kumpulainen S, Jarvelin K, Serola S, Doherty A.R, Byrne D, Smeaton A.F, and Jones G. Data Collection Methods for Analyzing Task-Based Information Access in Molecular Medicine. MobiHealthInf 2009 – 1st International Workshop on Mobilizing Health Information to Support Healthcare-related Knowledge Work, Porto, Portugal, 16 January 2009.
  178. Blighe, Michael. Organising and structuring a visual diary using visual interest point detectors. PhD thesis, Dublin City University. December 2008.
  179. Doherty, Aiden R. Providing effective memory retrieval cues through automatic structuring and augmentation of a lifelog of images. PhD thesis, Dublin City University. December 2008.
  180. Lee H, Smeaton A.F, O’Connor N, Jones G, Blighe M, Byrne D, Doherty A.R, and Gurrin C. Constructing a SenseCam Visual Diary as a Media Process. Multimedia Systems Journal, Special Issue on Canonical Processes of Media Production, Volume 14, Issue 6, pp 341–349. December 2008.
  181. Gareth J.F. Jones, Cathal Gurrin, Liadh Kelly, Daragh Byrne, and Yi Chen. Information access tasks and evaluation for personal lifelogs. In: 2nd International workshop on Evaluating Information Access (EVIA), Tokyo, Japan. December 2008.
  182. Byrne D, Doherty A.R., Snoek C.G.M., Jones G.F., and Smeaton A.F. Validating the Detection of Everyday Concepts in Visual Lifelogs. SAMT 2008 – 3rd International Conference on Semantic and Digital Media Technologies, Koblenz, Germany, 3-5 December 2008.
  183. Matthew L. Lee and Anind K. Dey. Wearable experience capture for episodic memory support. 12th IEEE International Symposium on Wearable Computers, pp. 107-108. October 2008.
  184. Doherty A.R, Ó Conaire C, Blighe M, Smeaton A.F, and O’Connor N. Combining Image Descriptors to Effectively Retrieve Events from Visual Lifelogs. MIR 2008 – ACM International Conference on Multimedia Information Retrieval, Vancouver, Canada, 30-31 October 2008.
  185. Byrne, D. and Jones, G. J. 2008. Towards computational autobiographical narratives through human digital memories. In Proceedings of the 2nd ACM International Workshop on Story Representation, Mechanism and Context (Vancouver, British Columbia, Canada, October 31, 2008). SRMC ’08. ACM, New York.
  186. Michael Massimi, Emma Berry, Georgina Browne, Gavin Smyth, Peter Watson & Ronald M. Baecker. An exploratory case study of the impact of ambient biographical displays on identity in a patient with Alzheimer’s disease. Neuropsychological Rehabilitation Volume 18, Issue 5-6. October 2008.
  187. Bowen, M. An investigation of the therapeutic efficacy of SenseCam as an autobiographical memory aid in a patient with temporal lobe amnesia. University of Exeter MSc project. October 2008.
  188. Lee, M. L. and Dey, A. K. 2008. Lifelogging memory appliance for people with episodic memory impairment. In Proceedings of the 10th international Conference on Ubiquitous Computing, Seoul, Korea. UbiComp ’08, vol. 344. ACM, New York, NY, 44-53. September 2008.
  189. Byrne, Daragh, Lee, Hyowon, Jones, Gareth J.F. and Smeaton, Alan F. Guidelines for the presentation and visualisation of lifelog content. In: iHCI 2008 – Irish Human Computer Interaction Conference 2008, 19-20 September 2008, Cork, Ireland.
  190. Melissa Bowen, An investigation of the therapeutic efficacy of SenseCam as an autobiographical memory aid in a patient with medial temporal lobe amnesia. MSc Thesis, University of Exeter, September 2008.
  191. Byrne D, Doherty A.R, Gareth J.F. Jones, Smeaton A.F, Kumpulainen S and Jarvelin K. The SenseCam as a Tool for Task Observation. HCI 2008 – 22nd BCS HCI Group Conference, Liverpool, U.K., 1-5 September 2008.
  192. Blighe M, Doherty A.R, Smeaton A.F and O’Connor N. Keyframe Detection in Visual Lifelogs. PETRA 2008 – 1st International Conference on Pervasive Technologies Related to Assistive Environments, Athens, Greece, 15-19 July 2008.
  193. Doherty A.R. and Smeaton A.F. Combining Face Detection and Novelty to Identify Important Events in a Visual LifeLog. CIT 2008 – IEEE International Conference on Computer and Information Technology, Workshop on Image- and Video-based Pattern Analysis and Applications, Sydney, Australia, 8-11 July 2008.
  194. Doherty A.R., Byrne D, Smeaton A.F., Jones G.J.F. and Hughes M. Investigating Keyframe Selection Methods in the Novel Domain of Passively Captured Visual Lifelogs. CIVR 2008 – ACM International Conference on Image and Video Retrieval, Niagara Falls, Canada, 7-9 July 2008.
  195. Blighe M and O’Connor N. MyPlaces: Detecting Important Settings in a Visual Diary. CIVR 2008 – ACM International Conference on Image and Video Retrieval, Niagara Falls, Canada, 7-9 July 2008.
  196. Cathal Gurrin, Daragh Byrne, Noel O’Connor, Gareth J.F. Jones and Alan F. Smeaton. Architecture and challenges of maintaining a large-scale, context-aware human digital memory. 5th International Conference on Visual Information Engineering (VIE 2008), pp. 158 – 163. DOI: 10.1049/cp:20080301. July 2008.
  197. Blighe M, Sav S, Lee H, and O’Connor N. Mo Músaem Fíorúil: A Web-based Search and Information Service for Museum Visitors. ICIAR 2008 – International Conference on Image Analysis and Recognition, Povoa de Varzim, Portugal. June 2008.
  198. Puangpakisiri W, Yamasaki T and Aizawa K. High level activity annotation of daily experiences by a combination of a wearable device and Wi-Fi based positioning system. IEEE International Conference on Multimedia and Expo. June 2008.
  199. Matthew L. Lee and Anind K. Dey. Using lifelogging to support recollection for people with episodic memory impairment and their caregivers. Proceedings of the 2nd International Workshop on Systems and Networking Support for Health Care and Assisted Living Environments. June 2008.
  200. Doherty A.R. and Smeaton A.F. Automatically Segmenting Lifelog Data Into Events. WIAMIS 2008 – 9th International Workshop on Image Analysis for Multimedia Interactive Services, Klagenfurt, Austria, 7-9 May 2008.
  201. Blighe M, O’Connor N, Rehatschek H and Kienast G. Identifying Different Settings in a Visual Diary. WIAMIS 2008 – 9th International Workshop on Image Analysis for Multimedia Interactive Services, Klagenfurt, Austria, 7-9 May 2008.
  202. I. Budvytis, J. Scott, A. Butler. Compass-Based Automatic Picture Taking using SenseCam. In Proceedings of the 6th International Conference on Pervasive Computing, Pervasive ’08, Sydney Australia. May 2008.
  203. Fuller M, Kelly L and Jones G. Applying Contextual Memory Cues for Retrieval from Personal Information Archives. PIM 2008 – Proceedings of Personal Information Management, Workshop at CHI 2008, Florence, Italy, 5-6 April 2008.
  204. Fleck, R. Exploring the Potential of Passive Image Capture to Support Reflection on Experience, DPhil Thesis, Department of Psychology, University of Sussex, UK. January 2008.
  205. Gurrin C, Smeaton A.F, Byrne D, O’Hare N, Jones G and O’Connor N. An Examination of a Large Visual Lifelog. AIRS 2008 – Asia Information Retrieval Symposium, Harbin, China. January 2008.
  206. Matthew L. Lee and Anind K. Dey, Providing good memory cues for people with episodic memory impairment. In Proceedings of the 9th international ACM SIGACCESS conference on Computers and accessibility (Assets ’07). ACM, New York, NY, USA, 131-138. October 2007.
  207. Deborah Barreau, Abe Crystal, Jane Greenberg, Anuj Sharma, Michael Conway, John Oberlin, Michael Shoffner and Stephen Seiberling. Augmenting Memory for Student Learning: Designing a Context-Aware Capture System for Biology Education, Proceedings of the American Society for Information Science and Technology, Volume 43, Issue 1, Pages 251–251, October 2007.
  208. Liadh Kelly. Searching Heterogeneous Human Digital Memory Archives. In: K-Space Jamboree Workshop, Berlin, Germany. September 2007.
  209. Liadh Kelly. The Information Retrieval Challenge of Human Digital Memories. BCS IRSG Symposium: Future Directions in Information Access 2007, Glasgow, Scotland. August 2007.
  210. Emma Berry, Narinder Kapur, Lyndsay Williams, Steve Hodges, Peter Watson, Gavin Smyth, James Srinivasan, Reg Smith, Barbara Wilson and Ken Wood. The Use of a Wearable Camera, SenseCam, as a Pictorial Diary to Improve Autobiographical Memory in a Patient with Limbic Encephalitis: A Preliminary Report. Neuropsychological Rehabilitation, 2007, 17 (4/5), 582–601. August 2007.
  211. Byrne D, Lavelle B, Doherty A, Jones G and Smeaton A.F. Using Bluetooth and GPS Metadata to Measure Event Similarity in SenseCam Images. Accepted for presentation at IMAI’07 – 5th International Conference on Intelligent Multimedia and Ambient Intelligence, Salt Lake City, Utah, 18-24 July, 2007.
  212. Doherty A, Smeaton A.F, Lee K, and Ellis D. Multimodal Segmentation of Lifelog Data. Accepted for presentation at 8th RIAO Conference – Large-Scale Semantic Access to Content (Text, Image, Video and Sound), Pittsburgh, PA, 30 May – 1 June, 2007.
  213. Abigail J. Sellen, Andrew Fogg, Mike Aitken, Steve Hodges, Carsten Rother, and Ken Wood. 2007. Do life-logging technologies support memory for the past?: an experimental study using SenseCam. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’07). ACM, New York, NY, USA, 81-90. April 2007.
  214. Byrne, D. SenseCam Flow Visualisation for LifeLog Image Browsing. British Computer Society (BCS) IRSG Informer magazine, No. 22. Spring 2007.
  215. O’Conaire C, O’Connor N, Smeaton A.F. and Jones G. Organising a daily Visual Diary Using Multi-Feature Clustering. SPIE Electronic Imaging – Multimedia Content Access: Algorithms and Systems (EI121), San Jose, CA, 28 January – 1 February 2007.
  216. Smeaton A.F, O’Connor N, Jones G, Gaughan G, Lee H and Gurrin C. SenseCam Visual Diaries: Generating Memories for Life. Poster presented at the Memories for Life Colloquium 2006, British Library Conference Centre, London, U.K., 12 December 2006. (poster)
  217. Smeaton A.F, Diamond D and Smyth B. Computing and Material Sciences for LifeLogging. Presented at the Memories for Life Network Workshop 2006, British Library Conference Centre, London, U.K., 11 December 2006.
  218. Smeaton A.F. Content vs. Context for Multimedia Semantics: The Case of SenseCam Image Structuring. SAMT 2006 – Proceedings of The First International Conference on Semantics And Digital Media Technology. Lecture Notes in Computer Science (LNCS), Athens, Greece, 6-8 December 2006.
  219. E. Berry, M. Conway, C. Moulin, H. Williams, S. Hodges, L. Williams, K. Wood and G. Smith. Stimulating episodic memory: Initial explorations using SenseCam. Abstracts of the Psychonomic Society. 47th Annual Meeting, 11, 56-57, Oxford University Press. November 2006.
  220. Hyowon Lee, Alan F. Smeaton, Noel E. O’Connor and Gareth J.F. Jones. Adaptive Visual Summary of LifeLog Photos for Personal Information Management. AIR 2006 – First International Workshop on Adaptive Information Retrieval, Glasgow, U.K., 14 October 2006. (poster)
  221. Blighe M, Le Borgne H, O’Connor N, Smeaton A.F and Jones G. Exploiting Context Information to aid Landmark Detection in SenseCam Images. ECHISE 2006 – 2nd International Workshop on Exploiting Context Histories in Smart Environments – Infrastructures and Design, 8th International Conference of Ubiquitous Computing (Ubicomp 2006), Orange County, CA, 17-21 September 2006.
  222. Steve Hodges, Lyndsay Williams, Emma Berry, Shahram Izadi, James Srinivasan, Alex Butler, Gavin Smyth, Narinder Kapur, and Ken Wood. 2006. SenseCam: a retrospective memory aid. In Proceedings of the 8th international conference on Ubiquitous Computing (UbiComp’06), Paul Dourish and Adrian Friday (Eds.). Springer-Verlag, Berlin, Heidelberg, 177-193. September 2006.
  223. A. Tjoa, A. Andjomshoaa, S. Karim. Exploiting SenseCam for Helping the Blind in Business Negotiations. In: Miesenberger K., Klaus J., Zagler W.L., Karshmer A.I. (eds) Computers Helping People with Special Needs, ICCHP 2006. Springer Lecture Notes in Computer Science, vol 4061, pp. 1147–1154. July 2006.
  224. Seungwon Yang, Ben Congleton, George Luc, Manuel A. Pérez-Quiñones and Edward A. Fox. Demonstrating the use of a SenseCam in two domains. Proceedings of the 6th ACM/IEEE-CS Joint Conference on Digital Libraries. June 2006.
  225. Fleck, R & Fitzpatrick, G. Supporting reflection with passive image capture. In Supplementary Proceedings of Cooperative Systems Design (COOP ’06), Carry-le-Rouet, France. pp.41-48. May 2006.
  226. Ashbrook, D.; Lyons, K.; Clawson, J. Capturing Experiences Anytime, Anywhere. IEEE Pervasive Computing Magazine, Volume 5, Issue 2, April-June 2006.
  227. Lee, M & Dey, A. Capturing and Reviewing Context in Memory Aids. April 2006.
  228. Cherry, S. Total recall: life recording software. IEEE Spectrum, volume 42, pp 24-30, November 2005.
  229. Fleck, R. Exploring SenseCam to inform the design of image capture and replay devices for supporting reflection. In E. Martinez-Miron and D. Brewster (Eds) Advancing the potential for communication, learning and interaction, 8th Human Centred Technology Postgraduate Workshop, Department of Informatics, University of Sussex, Brighton, UK. June 2005.
  230. Jim Gemmell, Lyndsay Williams, Ken Wood, Roger Lueder, and Gordon Bell. 2004. Passive capture and ensuing issues for a personal lifetime store. In Proceedings of the the 1st ACM workshop on Continuous archival and retrieval of personal experiences (CARPE’04). ACM, New York, NY, USA, 48-55. October 2004.