# LDR Cyclops - holding, scanning, seeking, recalling

ldr_eye_telescope_scanning.bas (3460 bytes)

Continuation from here.

Video straight from the camera to YouTube (sorry, Oddbot).

Continued here.

https://www.youtube.com/watch?v=lnlZtVeDtgA

I like the way this is going!

Awesome post my friend…

I am already gleaning a ton of info from it. First off, I finally know what the 13,10 at the end of all these sertxd commands is!

Questions:

How did you, if you did, determine the focal distance for your lens?

Is this a true telescope or is it just the one lens?

What are you using for a spreadsheet program and what are you using to spit-out your graphs, like the one above?

Have you tried to or had any problems with storing multiple bytes to a single eeprom address (write command)?

Does fire have mass?

What is the average flight speed of an unladen swallow?

Just thought of something…

Let me ask you this:

If your robot already knew what room he was in, how close to aligning himself to the room do you think you could get using your system? --No comparing rooms now, just aligning to the direction of the prerecorded fingerprint in one given room.

African or European?

This is great rik!

Could you explain your data a little more? I’m having some trouble. Is it 2 runs? brightest (dark)?

Have you tried moving it to a different location then moving it back and seeing how close the fingerprint is?

How are your math skills for convolution and matching ?

Have you done some experiments on “blindly” trying to position your LDREye from just the data?

Holy Stereopsis Batman! - have you done ranging to the brightest or darkest point? Come on rik, I’m on the edge of my seat !

unladen swallow
Mike stole my line!

Convolute this!

Are you kidding me?

xor

Are you F’ing me?

xor

Are you F’ing kidding me!?

blind positioning

Not sure what you mean by that. The current seeking routine will scan right to left in steps of 5 (from 80 to 220). Then it returns to the brightest step in that sequence and turns back 9 servo bits. Then it scans 18 bits / servo positions in steps of 1. The position (azimuth) with the brightest reading is noted.

The servo is then slightly beyond the brightest point (unless the situation has changed on him). It will "prove" that it has found something by pointing at it. In this case it first points at a lamp on the ceiling, later at the desk lamp. That pointer behaviour is based on the position stored in memory.
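For what it's worth, that coarse-then-fine seek can be sketched in Python (a stand-in for the actual Picaxe BASIC; `read_ldr` is a hypothetical function that stands in for the ADC read at a given servo position):

```python
def seek_brightest(read_ldr):
    """Coarse scan right to left (80..220 in steps of 5), then back up
    9 servo bits and fine-scan 18 positions in steps of 1."""
    # Coarse pass: remember the position with the brightest reading.
    best_pos, best_val = 80, -1
    for pos in range(80, 221, 5):
        v = read_ldr(pos)
        if v > best_val:            # '>' keeps the FIRST of equal readings
            best_pos, best_val = pos, v
    # Fine pass: 18 single steps starting 9 bits before the coarse winner.
    start = best_pos - 9
    fine_pos, fine_val = start, -1
    for pos in range(start, start + 18):
        v = read_ldr(pos)
        if v > fine_val:
            fine_pos, fine_val = pos, v
    return fine_pos
```

With a simulated light field peaking at position 171, the fine pass lands exactly on the peak even though the coarse pass only gets within a few steps of it.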

spreadsheet
I use (an old version of) MS Excel, I am embarrassed to admit. Occupational hazard. I know it so well because I use it mainly to produce graphs like this one at work. Mostly server performance analysis and reporting.

multi byte storage

I did not try to store more than one 8-bit byte into a single address. I happened not to need that. Lemme check TFM…

“When word variables are used (with the keyword WORD) the two bytes of the word are saved/retrieved in a little endian manner (ie low byte at address, high byte at address + 1)”

Forget about the politically incorrect word joke (it pre-dates the invention of PC’ness and in fact, the invention of The PC). It explains that storing two bytes in eeprom requires you to store one byte at one address and the other byte at the next address.

The write command is apparently able to do this for you when storing a word value. The endianness (written with an intentional e) just determines which half-word (or byte) takes the front seat.
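The little-endian split the manual describes can be modelled like so (Python stand-in, not Picaxe code; the `eeprom` bytearray plays the part of the data EEPROM):

```python
eeprom = bytearray(256)  # stand-in for the Picaxe data EEPROM

def write_word(addr, value):
    """Store a 16-bit word little-endian, as the manual describes:
    low byte at the address, high byte at address + 1."""
    eeprom[addr] = value & 0xFF
    eeprom[addr + 1] = (value >> 8) & 0xFF

def read_word(addr):
    """Reassemble the word from its two bytes."""
    return eeprom[addr] | (eeprom[addr + 1] << 8)

write_word(10, 0x1234)
# eeprom[10] now holds 0x34 (low byte), eeprom[11] holds 0x12 (high byte)
```

So "one byte per address" still holds; the WORD keyword just spends two addresses for you.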

mass of fire

Ask that smartypants son of yours! Or Puff.

For those just tuning in: fire indeed has mass, but not as much as the same volume of plain cool air has. At any given pressure of course.

aligning stored pattern to reality and v.v.

Dunno. That would require (at higher cognition levels) that the system recognised two (or more) points in the room that it also correctly identifies in its stored pattern. Say two bright windows in the room, or two black holes in the universe.

There’s gotta be a dumber/faster method than that. Read: lower level cognition.

stereoptical ranging

No I have not. Turn To The Source Grog. RTFC.

I guess a "Four Eyes" Mintvelt construction would be required. Or a single eye sliding left to right (and then turning to reacquire the target). Angular difference between two readings would lead to range. Geometry is easier than integral calculus. Even on a Picaxe, I guess.

I suppose a mobile platform (robot anyone?) would be able to range a light source (or dark source) the same way a sailor ranges a light beacon. Recognising the stationary object is of course crucial, as is good sensing of travelled distance between readings (taking bearings). I suppose a robot could decide not to take its eye off the beacon. A sharp dark-light transition would probably be easiest to code for.
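The sailor's trick is plain triangle geometry: take a bearing on the beacon, drive a known distance, take a second bearing, and the law of sines gives the range. A minimal sketch, assuming bearings measured from the direction of travel:

```python
import math

def range_from_bearings(baseline, bearing1_deg, bearing2_deg):
    """Range to a stationary beacon from the second observation point.
    Two bearings (relative to the track) taken `baseline` units apart;
    the angle at the beacon is their difference, so by the law of sines
    range = baseline * sin(b1) / sin(b2 - b1)."""
    b1 = math.radians(bearing1_deg)
    b2 = math.radians(bearing2_deg)
    return baseline * math.sin(b1) / math.sin(b2 - b1)
```

Example: robot at (0,0) heading along x, beacon at (2,1). From (0,0) the bearing is about 26.57 degrees; after driving 1 unit to (1,0) it is 45 degrees, and the formula returns the true range sqrt(2) from the second point.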

fingerprint matching

Not yet. But that IS what this is all about. But first I want high resolution scans over lengths of time. Just to get a feel for the informational challenge we’re up against here.

In other words: I want scans like this one for every room in the house, for every hour of the day/night. And then see what the natural variations are.
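As for the lower-level matching idea: the dumbest thing that could possibly work is probably sliding the fresh scan over the stored fingerprint and scoring the overlap at each shift. Purely illustrative Python (not from the actual project); the scans are lists of brightness readings treated as circular:

```python
def best_shift(stored, fresh, max_shift=4):
    """Find the circular shift (in scan steps) that best aligns a fresh
    scan with a stored fingerprint, by minimising the summed absolute
    difference over candidate shifts in [-max_shift, max_shift]."""
    n = len(stored)

    def score(shift):
        return sum(abs(stored[i] - fresh[(i + shift) % n]) for i in range(n))

    return min(range(-max_shift, max_shift + 1), key=score)

stored = [0, 1, 5, 9, 5, 1, 0, 0, 0, 0]
# The same room scanned from a heading 3 steps off:
fresh = [stored[(i - 3) % len(stored)] for i in range(len(stored))]
# best_shift(stored, fresh) recovers the misalignment of 3 steps
```

The winning shift is exactly the alignment error, which would answer the earlier question about pointing the robot at the direction of the prerecorded fingerprint.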

Oops, my TJ parts all arrived today… Fat chance, homeboy!

data comments

You are looking at two sets of data in one graph. One set is called “dark” and the other “lamp”. They’re the actual sets from the video. The dark set is without the desk lamp lit. The other one has it turned on.

At servo position 80 (that’s pulses to the RC servo of .8 ms) the scope looks far right (from the scope’s pov). As it turns toward the left (2.2 ms pulses), you see the light levels increase. Dead center (150, or 1.5 ms) you see that light levels between the datasets start to differ. The desk lamp starts to get noticed.

The blue graph has a peak at position (azimuth) 171 (square in the graph). The orange peak is later in the scan, or further to the scope’s left, at pos 208. It is also brighter. That’s where the scope is staring the desk lamp in the face. Or rather, starts to look it in the face. The peak is three readings wide, of equal brightness 19. The code notes the very first of these three as the ultimate brightest reading. This comes from the
if voltage > brightest then…
Had that comparator been a “>=”, aka “greater than or equal”, it would have found the last reading to be the brightest.

In order to make it find the middle one of the three, I would have to smarten up the code: first find the leftmost and rightmost, then take the average of those positions. It’s easier to just enhance the LDR resolution in the ADC or tune the potmeter. That way no two adjacent positions would ever read the same voltage.
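Both behaviours are easy to demonstrate side by side (Python stand-in for the Picaxe loop; the plateau-averaging fix is the hypothetical smarter version, not what the actual code does):

```python
def peak_position(readings, center_of_plateau=False):
    """Index of the brightest reading. With '>' the FIRST of several
    equal maxima wins ('>=' would pick the last). The optional fix
    averages the leftmost and rightmost positions of the plateau."""
    brightest, pos = -1, 0
    for i, v in enumerate(readings):
        if v > brightest:           # the comparator from the sketch above
            brightest, pos = v, i
    if center_of_plateau:
        left = readings.index(brightest)                          # leftmost max
        right = len(readings) - 1 - readings[::-1].index(brightest)  # rightmost max
        pos = (left + right) // 2
    return pos

readings = [3, 7, 19, 19, 19, 5]   # a three-wide plateau of brightness 19
# peak_position(readings) returns 2 (first of the plateau)
# peak_position(readings, center_of_plateau=True) returns 3 (the middle)
```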

focal distance

160 mm give or take a yard.

I burned some hairs off my left hand while also trying to hold a ruler. Later I adjusted the focal length of my telescope (it’s two cylinders sliding in/out of each other; I think that’s called “telescopic”) while reading the resistance on my multithingey, or reading the debug data from my Picaxe, until it indicated the brightest reading I could get by sliding alone.

Remember; I did not try to get a crisp, 1x1 pixel, picture. There is no such thing. I tried to optimize for most light received.

true telescope

True in the sense that the tube is telescopic.

Just one lens. I considered two, hoping it would shorten the focal length of the total system. But when I found the toilet roll fitted so darn snug, I decided it was plenty light weight for my servo. Two lenses are also a headache to align and focus well.

data anomaly

Notice the significantly brighter readings at the rightmost position (80). Both sets show outliers there and nowhere else. The orange outlier is much further out of line than the blue one.

I suspect this is the delayed effect of the LDR. It just turned fast from left back to right. In this case (check levels at 220) from bright to dark. The telescope still "sees spots before its eye". More so after looking almost directly into the sun (desk lamp).

One more reading to the left, and the effect is gone. I guess 300 ms of waiting time is plenty, given these light levels, just not when adjusting from bright desk lamp to dark operator sweater.
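The anomaly is consistent with a first-order lag, which can be mimicked crudely (illustrative Python; the lag factor is a made-up number, not a measured LDR constant):

```python
def lagged_scan(true_levels, alpha=0.5):
    """Crude first-order lag model of a slow LDR: each reading moves
    only a fraction `alpha` of the way toward the true light level."""
    out, state = [], float(true_levels[0])
    for level in true_levels:
        state += alpha * (level - state)
        out.append(state)
    return out

# A scan that ends staring into the lamp (level 19) and then snaps
# back to the dark start position (level 2): the first dark readings
# come out inflated, just like the outliers at position 80.
readings = lagged_scan([2, 2, 2, 19, 2, 2])
```

In that run the two readings after the bright one come out well above the true dark level before decaying back, exactly the one-reading-wide spike seen in the graph.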

Any more questions?

I might take a vacation from this project as my TJ parts have arrived. Three packages in one day. But more questions and suggestions are always welcome.

Boy those pulleys are tiny!

Multi bytes at single address…

I just don’t get it, rik! I read the same about the word thing that confused me… If indeed there is a one byte per address limit, why did they write the command like this?

WRITE location, data, data, WORD wordvariable…

Data, data seems like two bytes to me! --Bastards!!