I was sick for a few days which slowed progress, but here's a quick update anyway.
1) Tracker feature
==========
First of all, the Tracker tab/panel is successfully set up to receive and show data in a time-based graph (with customizable display range). Right now it doesn't show much, though... only labels for "events" that are added to the session data through user scripts.
2) Eye-tracking feature
==========
I've been experimenting (and researching a bit) to attempt to add an eye-tracker feature to the app. Yep, it's what you imagine: trying to get the directions you look in-dream captured by the app, so they can be replayed in the morning.
I've had mixed, but mostly good, results so far. Not in actual dreams but in a lot of daytime testing, and iterating on the code.
Some things I've so far discovered:
A: You get much better (~3-4x) sensitivity when the device is resting directly over your eyes instead of on your forehead. Some people might be concerned about this, but at least for now I've been using it that way for testing. I wouldn't sleep on my stomach with it, but it seems safe enough on the side or back. Note that it's not pressed into your eyes--it works fine just barely contacting the surface, so I just let gravity hold it there.
B: My naive, initial conception of the eye muscles as just "mirroring" each other's movements was incorrect.
That is to say, when you look left and the left eye's EEG channel change is +10, the right eye's is NOT -10 as I assumed. Instead, it's something like -8.5. I was confused why at first, but then I looked up illustrations of the eye muscles, which you can see here: Eye Muscle Control
That provided a visual base from which I continued testing and conceptualizing. Then I realized that the drop in the right eye's EEG wasn't from some sort of "opposite movement". Rather, the outer lateral rectus muscle is substantially closer to the device's electrode than the inner medial rectus muscle, so the drop was just that outer muscle relaxing--to let the inner one (still on the right side, but toward the nose) meaningfully tense and pull the eye to the left. In other words, the muscles that pull the eye tense more strongly, EEG-wise, than the opposing set relaxes, and that's why the EEG doesn't "balance out". (After all, as mentioned on that site, the eye muscles are always tensed to some degree to keep the eye steady; they just fluctuate in strength to produce the movements.)
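To make that asymmetry concrete, here's a minimal sketch of how a left/right glance could be classified from the two per-eye channel deltas. The function name, threshold, and channel convention are all hypothetical illustrations, not the app's actual code:

```python
# Hypothetical sketch: classify a lateral glance from two per-eye EEG deltas.
# The threshold is illustrative, not a measured constant.

THRESHOLD = 5.0  # minimum rise to count as a deliberate glance

def classify_lateral(left_delta: float, right_delta: float) -> str:
    """Guess glance direction from per-eye EEG channel changes.

    A leftward glance shows up as a strong rise on the left channel
    (e.g. +10) and a weaker, not equal-and-opposite, drop on the right
    channel (e.g. -8.5), because the outer lateral rectus sits closer
    to the electrode than the inner medial rectus.
    """
    if left_delta > THRESHOLD and right_delta < 0:
        return "left"
    if right_delta > THRESHOLD and left_delta < 0:
        return "right"
    return "none"

print(classify_lateral(10.0, -8.5))  # -> left
```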
C: So what happens when both the left-eye and right-eye EEG channels show an increase? At first I assumed it was when both eyes tensed, which I took to mean rolling the eyes up to look higher. Well, that's true to an extent: if you look far up then far down in your field of vision, you will see this effect--but it's relatively weak, especially for the middle 50% of the vertical visual range.
Instead, what I just discovered tonight is that an increase in both corresponds much more strongly to focusing on something farther away, and a decrease to focusing on something closer (like when crossing your eyes).
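A sketch of how that rule could be checked (again with a made-up threshold and function name): both channels moving together would be read as a vergence change, i.e. a change in focal distance, rather than a lateral glance:

```python
# Hypothetical sketch: when both per-eye EEG deltas move in the same
# direction, interpret it as a change in focal distance rather than a
# lateral glance. The threshold is illustrative.

THRESHOLD = 3.0

def classify_vergence(left_delta: float, right_delta: float) -> str:
    if left_delta > THRESHOLD and right_delta > THRESHOLD:
        return "focus farther"   # both channels rise
    if left_delta < -THRESHOLD and right_delta < -THRESHOLD:
        return "focus nearer"    # both channels drop (e.g. crossing your eyes)
    return "no vergence change"

print(classify_vergence(6.0, 5.5))  # -> focus farther
```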
Now in my opinion, this is not quite as useful as glance-up and glance-down detection would have been (that would have meant you could try to outline shapes and have the outline recorded, which would be awesome!). But we'll take what we can get, and what we get in this case is still pretty cool: being able to tell how far away the stuff we're looking at is throughout the dream!
So imagine flying high above a city, dropping down to a beach, looking around at your group of friends, then looking closely at your hands to bring stabilization and deepening. It's not yet proven doable, but it would certainly be cool to have an app record left-right eye-movements + view-distances from this period, in a playable recording, and a log along the lines of: "view-distance at ~100 meters, view-distance decreases to ~5 meters, eyes look left 20 degrees, eyes look right 40 degrees, view-distance decreases to ~10 centimeters"
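One way such a recording could be stored is as a list of timestamped events rendered into a readable log. This is just a sketch of the idea; the event structure and field names are invented for illustration:

```python
# Hypothetical sketch of a session event log for the imagined recording.
# Event kinds and fields are invented for illustration.
from dataclasses import dataclass

@dataclass
class GazeEvent:
    t: float      # seconds since session start
    kind: str     # "view-distance" or "look"
    value: float  # meters for view-distance; signed degrees for look (negative = left)

def describe(e: GazeEvent) -> str:
    if e.kind == "view-distance":
        return f"view-distance at ~{e.value:g} meters"
    side = "left" if e.value < 0 else "right"
    return f"eyes look {side} {abs(e.value):g} degrees"

events = [
    GazeEvent(0.0, "view-distance", 100),
    GazeEvent(4.2, "view-distance", 5),
    GazeEvent(6.0, "look", -20),
    GazeEvent(7.5, "look", 40),
    GazeEvent(9.0, "view-distance", 0.1),
]
print(", ".join(describe(e) for e in events))
```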
If it worked, I think this would be really helpful for piecing together dreams in the morning.
(I highly doubt it will work as well as the above; although I'm pretty confident from the results I've gotten so far that it will, when mature, have some level of usefulness/practicality in this area of dream-recording.)
3) Other stuff
==========
Finally, another thing related to LD for me recently has been trying the DEILD technique(s) detailed by Michael Raduga in his book The Phase.
So far, it's been pretty promising. One of the most interesting points for me was his statement that whenever we wake up, we basically are still in the phase! Or at least "close enough" to it that we can often get directly back into a dream within seconds. So the idea is that because you're so close anyway, all you need to do is a few basic things and a good percentage of the time you'll be able to get back in, despite it feeling like it's all over. (And when it fails, don't spend any more effort; just go back to sleep and remember to try again next time for another chance.)
And I've found this to be true. At least it seems so, from the two times I've succeeded in the last few days. The last one, I got back into the dream like 4 times. Of course, I've done this before, but this time I was more confident in it, because of the optimism that "even if you wake up, you still have a good chance to get back in!".
I eventually woke, or seemed to wake, in bed the fifth time, and decided to stop there, since I'd had a good time and didn't feel like working my way back into the dream again. So I did, and I'm looking forward to trying it again some more tonight/tomorrow morning.
Anyway, it's all very interesting, mixing research/software-development and personal exploration/relaxation (and all the other random stuff mixed in) in this LD arena.
Good night from Seattle!