CISC499 projects for Fall 2016/Winter 2017
Although there are 5 project suggestions listed below, please understand that realistically, I can only supervise a few of them.
Any of the projects listed below will be
led by the Poppenk Computational Cognitive Neuroscience (POPMEM) lab, a new
NSERC-funded research group focused on understanding the nature of memory and
the contributions of neural structures to its function. Students will work
directly with the leader of this group, Dr. Jordan Poppenk, who is a Tier 2
Canada Research Chair in Cognitive Neuroimaging cross-appointed to CISC.
For more information about us, please
visit http://popmem.com
1. Open-source brains: a cognitive neuroimaging experiment database for Python
Our research group has developed "SuperPsychToolbox" [1], an exciting Matlab toolbox for gathering, managing, and analyzing data in cognitive neuroimaging experiments. It logs user
behaviour into extremely thorough data files that facilitate integration of
brain data with records of what actually happened during the brain-scanning
session. Using this toolbox, we aim to change the way that psychologists,
neuroscientists and others around the world conduct experiments and analyze the
resulting data. Its features and API are described at superpsychtoolbox.com.
We have a problem: in association with
Intel, our closest collaborators are developing important new neuroimaging
tools optimized for Intel hardware. However, they are pursuing this development
entirely in Python. We need a way to convert our complex experimental data
files, which are currently saved in the proprietary Matlab format with a
variety of complex data types, into Python. Because the same data structures
are not available in Python, a key challenge of this project would be to
develop an efficient Python structure for storing our scientific data. A tool
is then needed to convert our saved Matlab data files into this new Python
format, and to convert from the Python format back to Matlab. Beyond
serving as a conversion tool, this work will serve as the backbone of a
longer-term effort to port our own toolbox code into Python.
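As a rough illustration of the "backbone" idea (this is a hypothetical sketch, not the actual SuperPsychToolbox format): one candidate Python structure for a Matlab struct hierarchy is a tree of nested dicts, flattened into Matlab-legal top-level variable names for saving. The field names below (session1, trial1, rt) are invented for the example; in practice, the .mat file I/O itself could be handled by scipy.io's loadmat and savemat.

```python
SEP = "__"  # separator for flattened keys; must not appear in field names

def flatten(tree, prefix=""):
    """Flatten nested dicts into {'session1__trial1__rt': value} pairs,
    so each entry maps onto one Matlab variable or struct field."""
    flat = {}
    for key, value in tree.items():
        name = prefix + SEP + key if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

def unflatten(flat):
    """Invert flatten(): rebuild the nested-dict experiment record."""
    tree = {}
    for name, value in flat.items():
        parts = name.split(SEP)
        node = tree
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return tree
```

A record like {"subject": "s01", "session1": {"trial1": {"rt": 0.41, "key": "left"}}} round-trips through flatten and unflatten unchanged, which is the invariant any Matlab-to-Python-and-back converter would need to guarantee.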
An optional goal will be to create a
simple function that uses the Python toolbox PsychoPy to sample keypresses and
log them to this database, to demonstrate that the data format is flexible
enough to efficiently accommodate new data.
This project is most suitable for a
student with an interest in databases, Neuroscience, and/or Psychology.
[1] http://www.superpsychtoolbox.com
2. Lifeblogging (telemetry)
An interesting and active topic in human memory research is how the human brain represents space at different spatial scales [1]. To address this, some research groups have incorporated "lifeblogging" into their work, a procedure in which participants carry a cell phone in a way that lets it gather many images and other data to be used as stimulus materials. The materials are later presented to participants as they lie in a brain scanner [2] to investigate the neural basis of each individual's response to the memories.
Along these lines, we wish to create a custom lifeblogging app to securely gather data from participating mobile users. The app would use the user's phone sensors to gather image, time, audio (obfuscated), GPS, accelerometer, and orientation information throughout the day. This data would be automatically transmitted to our server using an HTTPS POST request whenever the phone was connected to Wi-Fi. The app would receive images from the server (gathered from other users) for periodic memory tests (was this your own experience, or someone else's?). It would also administer memory tests by presenting Google Street View images from the user's route, and the route of others.
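The upload path described above might look like the following sketch. The endpoint URL and all field names are illustrative assumptions, not a finished wire format; on a phone this logic would live in the mobile platform's own language, but the protocol is the same.

```python
import json
import urllib.request

def build_sample(timestamp, gps, accel, orientation, image_id=None):
    """Package one sensor reading as a JSON payload (bytes)."""
    sample = {
        "timestamp": timestamp,      # seconds since epoch
        "gps": gps,                  # (lat, lon)
        "accel": accel,              # (x, y, z) in m/s^2
        "orientation": orientation,  # (azimuth, pitch, roll) in degrees
        "image_id": image_id,        # reference to an uploaded image, if any
    }
    return json.dumps(sample).encode("utf-8")

def post_sample(url, payload):
    """Send one payload with an HTTPS POST; returns the server's reply."""
    request = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        return response.read()
```

The real app would queue samples locally and flush the queue through post_sample only when a Wi-Fi connection is detected, so that no cellular data is consumed.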
An optional goal would be to prepare this
app for use with smart-watches, in which case additional sensors could be
deployed (e.g., heart rate).
This project is most suitable for a
student with an interest in mobile development, telemetry, Neuroscience, and/or
Psychology.
[1] http://www.ncbi.nlm.nih.gov/pubmed/23597720
[2] http://www.pnas.org/content/112/35/11078
3. Video games for science: A city in flux (procedural city models)
It's not easy to test memory for space; how do you "really know" how good a person's memory is for a particular location? One possibility is to manipulate the environment and observe whether the person notices. But you usually can't change the world like this, and tweaking photographs is time-consuming, often conspicuous, and limited in terms of which manipulations are possible. What is needed is an environment over which the user has sufficient experimental control.
A lot of work has been done in recent
years on developing procedural urban modeling tools. With help from Rob Harrap
(faculty in Geology), who has worked with procedural generation and GIS as well
as on urban modeling at a range of scales, you will adapt and develop tools
that generate streetscape environments in Unity3d. Users will learn an environment by navigating it; the tool will then modify the environment, and we will test whether users can detect the changes. The research question: how do people learn to navigate their environments, and how do elements of that spatial representation become stronger (or perhaps, in some cases, weaker) with experience? The development question: how do you effectively deploy a tool that can build and then modify a streetscape by applying rules, and then track the details of a user interacting with that space?
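The build/modify/track loop can be caricatured in a few lines. This is a toy Python sketch under stated assumptions (the project itself would use Unity/C#, and real procedural rules are far richer): a "streetscape" is a grid of lots, each with a building type and height drawn from a seeded generator; a rule rewrites some lots, and a diff records exactly which lots changed, which is the ground truth a change-detection memory test needs.

```python
import random

TYPES = ["house", "shop", "office", "park"]

def generate_city(width, depth, seed):
    """Seeded so the identical city can be rebuilt for every participant."""
    rng = random.Random(seed)
    return {(x, z): (rng.choice(TYPES), rng.randint(1, 8))
            for x in range(width) for z in range(depth)}

def apply_rule(city, rule):
    """Return a modified copy; rule maps (lot, building) -> building."""
    return {lot: rule(lot, building) for lot, building in city.items()}

def diff(before, after):
    """Lots the experiment changed -- what detection responses are scored against."""
    return sorted(lot for lot in before if before[lot] != after[lot])
```

For example, a rule such as "every park becomes a shop of the same height" produces a mutated city whose diff against the original lists precisely the lots a perfectly attentive navigator could notice.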
As an optional goal, the game would be implemented as a mobile app that prompts the user to perform a short navigation twice a day (analogous to an ongoing "commute").
This project is most suitable for a
student with an interest in game design, geographic modeling, and/or
Psychology.
4. Video games for science: Queen's tri-(dimension) pride (photosphere mapping)
How well do you know your way around Queen's campus? That's a question my laboratory wants to ask new and returning Queen's students using a Unity-based video game built on real campus imagery. We want to learn how spatial representations of our campus change as people get to know it better. We're also interested in how the neural representations of memories migrate as they age, from one part of the hippocampus (an important brain structure for memory) to another [1].
If you've ever used Google's Street View, you'll be familiar with the concept of a "Photosphere": a 360° panorama that allows you to pivot your view in any direction [e.g., 2]. In this project, using special camera equipment, we'll gather photospheres at strategic points around campus; then, using software, we'll stitch them together, import the images and their geotags into Unity, and build a simple experiment that uses the spheres to test the memory of Queen's students.
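For intuition about the underlying geometry: a stitched photosphere is usually stored as an equirectangular image, in which yaw (longitude) and pitch (latitude) map linearly onto pixel coordinates. The sketch below assumes yaw in [-180, 180) with 0 at the image centre and pitch in [-90, 90] with 0 at the horizon; a Unity skybox effectively performs the inverse of this mapping when it textures the view sphere.

```python
def direction_to_pixel(yaw, pitch, width, height):
    """Return (u, v) pixel coordinates for a view direction in an
    equirectangular panorama of the given size."""
    u = (yaw / 360.0 + 0.5) * width     # longitude -> horizontal position
    v = (0.5 - pitch / 180.0) * height  # latitude  -> vertical position
    return u, v
```

So in a 4096 x 2048 panorama, looking straight ahead lands at the image centre, while looking straight up lands on the top row.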
We'll deploy the "game" on both desktop and mobile platforms, to be used by new and seasoned undergraduates, staff, faculty, and even alumni; and we'll eventually gather brain scans to predict which individuals have the best memories in this task.
This project is most suitable for a
student with an interest in game design, photography, Geography, Psychology,
and/or their campus.
[1] http://www.jneurosci.org/content/32/47/16982
5. Video games for science: like a rat in a virtual maze (VR)
To learn about human spatial memory, our lab has recently started to incorporate 3D virtual worlds into our experiments. We have also recently prepared a VR room with top-end equipment, including a commercial HTC Vive [1], an NVIDIA GTX 1080 card, a Unity environment customized for Psychological experimentation, and a complete memory-game "experiment" in which human participants experience what it's like to be a rodent finding its way around a virtual water maze (for early examples of how this has been used in Psychology, see [1]).
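The core trial logic of a virtual water maze is simple enough to sketch. This is an illustrative Python outline, not our Unity implementation: the participant searches a circular pool for a hidden platform, and the trial is scored by escape latency and path length travelled before escape, the standard measures in this paradigm.

```python
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def score_trial(path, platform, radius, dt=0.1):
    """path: list of (x, z) positions sampled every dt seconds.
    Returns (escape latency in seconds, or None if the platform was
    never found, and path length travelled before escape)."""
    latency = None
    length = 0.0
    for i, pos in enumerate(path):
        if i > 0:
            length += distance(path[i - 1], pos)
        if distance(pos, platform) <= radius:
            latency = i * dt
            break
    return latency, length
```

Across trials, a shrinking latency and path length is the behavioural signature that a spatial map of the pool is forming; the VR version lets us ask whether that map forms differently when navigation is embodied rather than driven from a video screen.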
We need your help to bring our "virtual water maze" experiment to VR! Doing so will help us address questions related
to the way we form spatial maps in embodied vs. video screen spaces. The
experiment will also serve as a platform for many future experiments on our VR
system.
This project will entail 1) adapting our existing Unity memory game to our HTC Vive environment using available software
tools; and then 2) adding levels to the game that better take advantage of
embodied virtual navigation, for instance, allowing you to take note of spatial
information in multiple dimensions. The levels you develop will be widely used
and promoted, and may also one day be used in a brain imaging study. Prior
experience with Unity is an asset, but not strictly required.
This project is most suitable for a
student with an interest in game design, VR, and/or Psychology.
[1] http://www.sciencedirect.com/science/article/pii/S0166432898000199