00:00.00 Paul Hi, welcome back to the ArchaeoTech Podcast, episode one seventy-four. Chris, just before going to break you were talking about these Rokid Air AR glasses, and you compared them to the Oculus, and I was going to ask you the question you kind of touched on. But if you could go into more detail I'd be curious. Ah, you got that Oculus specifically because you didn't have a good place to put a good-sized monitor in your RV. You were talking about, like, putting it up on the dashboard and it would overheat, so the Oculus allowed you to have multiple virtual screens. And the last we talked about it. 00:24.70 archpodnet Right. 00:34.90 Paul So I didn't get to play with it when I was in Nevada. That's a missed opportunity. The last we talked about it, it was working really well for you as an external monitor. I don't know if you're still using it like that. Um, but if you could just give me a little bit of a sense of how these new glasses compare versus. 00:34.96 archpodnet Oh shoot. Yeah, ah. 00:42.89 archpodnet Um. 00:52.34 Paul The old glasses in that use case. 00:52.59 archpodnet Yeah, well, the first thing is the new glasses I can plug into anything with a USB-C port. Technically you could plug them into, like, you know, a PlayStation or an Xbox. You could plug them straight into a TV if you wanted to. I don't know why you would, because a TV sits right in front of you. But you can plug them into a phone or a tablet, and I've done both of those things, and it works really well that way. The Oculus Quest, you simply can't do that; it's not something that you can use as a monitor in that sense. Now, there are ways to do that if you have a PC and some remote desktop stuff, but it's convoluted and doesn't make any sense for most users. 01:17.44 Paul Mm. 01:25.88 Paul This is. 01:30.76 archpodnet So the other thing with the Oculus: now, the digital office that I use is called Immersed VR, ah, and there are a number of them out there, but they're all basically the same. The Oculus is all-encompassing, so it's tough if you're not super good at touch typing without looking at the keyboard, even though you can get an on-screen virtual keyboard, and there are ways to do that: certain Logitech keyboards will go into VR, so to speak, and you can kind of draw the keyboard out in Immersed. But it's not super perfect, and it's hard to get used to, to be honest. But the benefit to the Oculus is I can spawn five displays, one of which is my primary display, and then four additional displays, and I have full control over where those are, whether they're portrait or landscape, and how big they are, and all that stuff. With these Rokid Air glasses I basically just have the one display on a 120-degree field of vision, and that's it, right? So if I'm able to use it as a second display, then that is a true second display with pretty decent clarity. Um, or I can just use it to mirror whatever I'm looking at, you know, whatever device I'm on. Now, the one cool thing is, you know, with my iPad, I hooked it into that and it was just mirroring the iPad. But when you run the. 02:27.75 Paul Mm. 02:42.32 archpodnet Brightness all the way down on the iPad, you can still see the screen. It's super dim, but you can still see the screen. When I run the brightness all the way down on my 2021 MacBook Pro, the screen goes black. Like, it literally turns off the screen. And if I'm mirroring that display, well, I can now use this with my AR glasses in, like, a coffee shop, on an airplane. 02:51.24 Paul Mm. 03:01.62 archpodnet Something like that, without having anybody peer at
what's on my screen, you know. Because we've all been sitting on an airplane and looked up between the seats and gone, what is that guy doing with those spreadsheets? His formula is totally wrong. But, ah, you know, like, they're just a guy getting ready for a high-pressure meeting, and I'm just, like, looking at your stuff. So, you know, the privacy aspect is greatly enhanced with this. I like that. And they're lightweight; they don't have a battery that'll die. It does suck battery life from your device. Um, but I feel like the field applications of this could be pretty good, depending on the data that's coming into them. You know, if we can get some, like, AR output-type apps, and I can just pull these glasses on while I'm standing in the field and visualize all the data I'm looking at in an augmented reality sort of way, that's kind of where I'm hoping technology like this goes. So, yeah. 03:49.26 Paul Um, so is it over one eye or both eyes, the, ah, the virtual screen? Okay. And one of the things, when we first started talking about, like, augmented reality glasses, yeah, was the. 03:55.20 archpodnet Yeah, it's stereo, so both eyes, and it brings those images together. Yeah, yeah. 04:07.15 archpodnet Yeah, yeah, yeah. 04:07.52 Paul Ability to determine or overlay, you know, get information about what you're looking at. Would that be a possibility with these? And, like, hook it up to your phone, and while you're doing your survey or whatever, have an on-screen readout. 04:17.85 archpodnet You know, I don't see why not, if the application was designed in a way that really was, like, ah, like an overlay, where the middle of the screen was left blank. These are see-through, and you can see through the image. Um. 04:27.92 Paul Um. 04:34.82 archpodnet If I put up a white window or something like that, you know, with a white background, it's really hard to see through that. So you'd have to design something that really does, if it truly has a transparent background, and no, like, you know, desktop image or something like that, and then you just have the overlay information around the edges. Absolutely, it would work. There's no reason why it wouldn't. 04:37.33 Paul Um. 04:54.33 archpodnet So the thing is, it's not interacting right now with the actual environment. You're seeing stuff that's coming from another device, but it's not, like, identifying mountains in the distance, or, you know, able to look at something and get information on it, something like that. It's only passive. It's just displaying what's coming from its own device. So. 05:00.54 Paul Um. 05:07.70 Paul Gotcha. 05:10.98 Paul So I could see that being useful for biometric information, temperature, which, you know, you get off your phone, um, altitude, GPS coordinates, you know, a number of things like that. But yeah, it would. 05:13.60 archpodnet Yeah. 05:21.66 archpodnet Um, yeah, yeah, for sure. 05:30.21 Paul Have to be designed in a way that it doesn't interfere with what you're actually trying to do but just kind of augments it, no pun there, but, you know, off to the side. 05:38.15 archpodnet Ah, as though your, ah, your reality were, like, enhanced or augmented. It's crazy, right? It's really weird. Um, yeah, how about that? So, you know what would make these glasses really good, though, is if we had a lot more data, imagery data, that we could use. 05:45.10 Paul Ah. 05:54.64 Paul Ah, nice.
05:57.15 archpodnet And maybe then identify things just by looking. Could you imagine looking out on the landscape, on a fresh landscape that you're just about to survey, and having your glasses, your device that you're using, start identifying previously unknown archaeological features? That is the future. Yeah. 06:12.25 Paul Oh, that'd be really cool. Kind of a heads-up display, right? So the same thing, like the fighter pilot looking through the windscreen and seeing, you know, the enemy planes being identified and picked out, even though they're just little dots off in the distance. Ah, yeah. 06:16.98 archpodnet Exactly. 06:25.30 archpodnet Yeah, exactly, except it's actually doing, like, real-time identification. So I think that's probably a number of years down the road, but the foundation of stuff like that is in the paper we're going to finally talk about halfway through this podcast. So, Paul, why don't you, ah, tee that up for us? Ah. 06:40.71 Paul I'll tee this up for you. Okay, so this paper, I'm not sure how I found it, because it's in a journal called Remote Sensing, the journal of remote sensing, so it's not an archaeological journal per se. The lead author on it, Mark Altaweel, I follow him on Twitter, so I assume that I came across this article based off of a post of his online. And the article's title, if you didn't get it from the title of this episode, is "Automated Archaeological Feature Detection Using Deep Learning on Optical UAV Imagery: 07:03.27 archpodnet Um, yeah. 07:17.75 archpodnet Um, yeah. 07:18.10 Paul Preliminary Results", which is one hell of a mouthful. Ah, but it's interesting. You know, we've talked about other kinds of deep learning identification of visual information in photographs and such, to find, like, lithics, for example. There's been discussion, not that I think we've ever talked about it, but there certainly has been quite a bit of work over the years on using pattern recognition for identifying sites, and that's basically what they're doing in this, and they're describing their particular process, and not just the process, but also their software. 07:48.31 archpodnet Ah. 07:55.90 Paul Because it's written by, at least, the lead author. But there are a half dozen people on this paper, from the Middle East, from Germany, and from the UK, and I'm not sure how much they all contributed to this, but clearly it's a collaborative project. 08:10.76 archpodnet Ah, yeah. 08:15.49 Paul That they're working on. And they have their software. Their software is on GitHub. The article, especially in the discussion and the conclusions, goes on at length about the need for more access to more data. Um. 08:31.63 archpodnet Ah. 08:33.77 Paul And so I just thought, you know, it touches on a lot of the things. A lot of the article, frankly, again, went over my head, and I think part of that is because there are a few terms that aren't defined in it. Um, but to tie this back to what I said early on about the subsurface architecture detection at Lagash: I'm looking at various kinds of optical recognition programs to try to get a sense of whether that's going to be something useful at Lagash for finding these buried structures. So I just read, um, I just read an unpublished article. 09:04.91 archpodnet Oh. 09:12.91 Paul That Emily Hammer did, and it will be published. It's excellent.
It was, ah, on Lagash, on work that she did a couple years ago there, and part of the project that she did was tracing a lot of these buildings, these walls, and she did it in a manual process, playing with the contrast and the histograms for the imagery and seeing, you know, where she could see walls in certain pictures and not in others, or where she could see them in multiple and then have greater confidence. And I'm thinking that it'd be interesting to try to. 09:35.97 archpodnet Ah, yeah. 09:45.37 Paul Do the same sort of approach, but in, ah, in a machine learning environment. I know jack about machine learning, but I listen to a number of Python podcasts and GIS podcasts, and I read this stuff. Um, I mean, I'm adjacent to these worlds, so I need to start learning this, and I think that. 10:00.47 archpodnet Yeah. 10:04.77 Paul This is going to be a good project for me going forward. Anyhow, this article caught my attention because that title mentions a bunch of different words, and a bunch of those words are very adjacent to things that I either do or want to learn about. 10:18.85 archpodnet Yeah, well, this is really cool, too, because, like I said when we teed this up here, I mean, I personally first started thinking about stuff like this when we worked at China Lake Naval Weapons Center back in 2015, and I was just thinking, man, this is so dangerous, with the snakes and the bombs and just the environment alone, that if only we could have some sort of drone imagery and then be able to actually identify stuff using that drone imagery. Now, in the very limited sense, I was thinking of just. 10:42.71 Paul Um, mm. 10:57.81 archpodnet Visually looking at, like, video imagery that was taken, like, high-resolution video imagery, and doing that instead of survey, right? So you could zoom in and, you know, try to see stuff and figure things out. But of course, if you could teach a computer to do that, it's going to be way better than the human eye is ever going to be, at some point. Um, in fact, when I was reading this article and they were talking about training these models, I kept thinking of, you know, like, a movie scenario where, you know, the researcher is getting frustrated because he keeps training and trying to tell this thing, and the computer keeps coming back with incorrect responses, saying, oh, you've got features over here, and he's like, no, we don't, because I didn't give you those images. And then all of a sudden. 11:16.39 Paul Listen. 11:35.86 archpodnet You know, the guy realizes, oh man, the computer's been right all along. We've missed this thing forever because we couldn't see it and the computer can. And I was like, that plot just writes itself. But I think that's what's going to happen, right? Like, we get these things smart enough to do that pattern recognition on the types of patterns that we're looking for. I mean, we should be able to, we really should be able to not just identify things in a quicker way than we're doing now and cover a lot more ground, but my hope is we'll be able to find stuff we didn't even know existed, and maybe didn't even know were, like, human-created features, until we had these algorithms to identify them. 12:07.24 Paul Um, yeah. Um, and a lot of this article is actually about building the pattern recognition into the system, right? Um. 12:11.58 archpodnet You know what I mean? 12:20.19 archpodnet Yeah, yeah, yeah.
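For listeners who want to play with the kind of manual contrast work Paul describes Emily Hammer doing, here is a minimal sketch in Python of a percentile-based contrast stretch on a single band of imagery. This is not her workflow, just one simple way to exaggerate subtle tonal differences; the file name and the percentile cutoffs are illustrative assumptions.

```python
# Percentile contrast stretch on one band of a GeoTIFF, in the spirit of
# manually adjusting a histogram to make faint wall traces stand out.
import numpy as np
import rasterio

def stretch(band: np.ndarray, low_pct: float = 2, high_pct: float = 98) -> np.ndarray:
    """Linearly rescale values between two percentiles to the 0-255 range."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    hi = max(hi, lo + 1e-6)  # guard against a flat image
    out = np.clip((band - lo) / (hi - lo), 0, 1)
    return (out * 255).astype(np.uint8)

with rasterio.open("ortho.tif") as src:  # hypothetical input file
    band = src.read(1).astype(np.float32)
    profile = src.profile

profile.update(dtype="uint8", count=1)
with rasterio.open("ortho_stretched.tif", "w", **profile) as dst:
    dst.write(stretch(band), 1)
```

Tightening the cutoffs (say, 5 and 95) pushes more midtones apart, which is roughly what dragging the ends of a histogram slider does in a GIS.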
12:23.43 Paul So it still isn't as good as what people do, but they're trying to make it get there. Um, and so it's interesting from that point, because it's very much a work in progress. Fantastic kudos that they have the code available online. I tried downloading and running it, but, um, getting Qt, which is the. 12:42.53 archpodnet Ah. 12:42.92 Paul Graphical interface, working on my Mac was just a little more than I had the time to deal with right now, because of everything else going on. Um, but I will at some point. I mean, the screenshots are all on Ubuntu, which is my flavor of Linux of choice, so I'll grab an Ubuntu box and try it there. 12:47.98 archpodnet Ah, yeah. 12:59.87 archpodnet Ah. 13:01.54 Paul Um, and, you know, back to your thing about having the augmented reality glasses: yeah, we're not there yet, because, and they discuss this in the article, they're talking about doing some of this on, you know, whatever computer you have handy, but then also offloading a lot of the learning and the pattern recognition to higher-performance computers, and clusters even, right? So, yeah, we're not there. That's not going to be on your glasses for ten years, that kind of processing power, but, you know, it will be there eventually, and. 13:18.98 archpodnet Yeah. 13:26.33 archpodnet Sure. 13:33.93 Paul There are also ways, just like what they're talking about here with moving workflows between one device and another, there are going to be ways of offloading some of this processing to stuff that you're not physically carrying around with you, you know? And maybe it's through that Starlink link back to a supercomputing cluster someplace else in the world. But. 13:45.24 archpodnet Huh. 13:49.50 archpodnet Yeah. 13:52.27 Paul It processes the images, uploads them, the actual intelligence happens somewhere else, and then it sends results back very quickly, so that you can identify what you're looking at. I'm actually, I'm going to sidetrack us again, because that seems to be my job lately. When you were talking about identifying things that we don't see with our eyes, ah, I was listening to a webinar yesterday about a historical archaeology project in Rhode Island, and they showed some LiDAR imagery onto which were laid the maps of what they were doing, of where their trenches were and so on, and it looked nice, and I thought, oh. 14:24.23 archpodnet Mm. 14:30.99 Paul LiDAR imagery. I forgot to think, you know, what kind of LiDAR imagery is available for my area? So, I was working on a project next to the Hudson River back in October, November, December, you know, so either side of my last trip to Iraq, and I decided to go on the New York State. 14:33.73 archpodnet Yeah. 14:42.78 archpodnet Oh. 14:50.36 Paul GIS department's website, and I found that they do have links for various kinds of imagery, of DEMs and such. And I downloaded a USGS one-meter processed DEM, four different tiles of it, pulled them together, merged them so that they're all on the same histogram, and. 14:59.82 archpodnet Okay. 15:09.24 Paul Dropped them over the area where we were working, you know. And I looked at them in black and white, you know, the grayscale that you normally get from a DEM, and I'm like, okay, fine. I changed the color, and then, let's see, then I went and I did a contour view in QGIS, because now that's just a visualization. 15:14.40 archpodnet Mm.
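The paper's own code is on the authors' GitHub; the sketch below is not that code, just a generic Python illustration of the core idea Paul describes: slicing a large UAV orthomosaic into tiles and running each tile through a trained semantic-segmentation network. The tile size, the model weights file, and the class labels are all placeholders.

```python
# Generic sketch of deep-learning feature detection on drone imagery:
# tile the orthomosaic, run a segmentation model on each tile, and
# stitch the per-pixel predictions back into one mask. NOT the paper's
# pipeline; model path and tile size are hypothetical.
import numpy as np
import torch

TILE = 512  # pixels per tile edge (placeholder)

def iter_tiles(image: np.ndarray):
    """Yield (row, col, tile) windows across an H x W x 3 image."""
    h, w, _ = image.shape
    for r in range(0, h - TILE + 1, TILE):
        for c in range(0, w - TILE + 1, TILE):
            yield r, c, image[r:r + TILE, c:c + TILE]

def detect_features(image: np.ndarray, model: torch.nn.Module) -> np.ndarray:
    """Return a per-pixel mask of predicted feature classes."""
    mask = np.zeros(image.shape[:2], dtype=np.uint8)
    model.eval()
    with torch.no_grad():
        for r, c, tile in iter_tiles(image):
            # HWC uint8 -> 1CHW float in [0, 1]
            x = torch.from_numpy(tile).permute(2, 0, 1).float().unsqueeze(0) / 255.0
            logits = model(x)                       # (1, n_classes, TILE, TILE)
            pred = logits.argmax(dim=1).squeeze(0)  # class index per pixel
            mask[r:r + TILE, c:c + TILE] = pred.numpy().astype(np.uint8)
    return mask

# model = torch.load("feature_detector.pt")  # hypothetical trained weights
# mask = detect_features(ortho_array, model)
```

This per-tile loop is also why Paul's point about offloading matters: the inner call to the model is the expensive part, and it is exactly the piece you would ship off to a GPU cluster rather than run on a field laptop or a pair of glasses.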
15:28.80 Paul You can just turn it on, like you change the color ramp. And then that looked interesting, and just for yucks, I put on the hillshade model view. Normally I don't like hillshades. I have a fundamental opposition to them because of the way that they look. I mean, it's stupid, but the way that they look normal. 15:35.45 archpodnet Um, ah. 15:47.75 Paul Is with the sun, the lighting source, being up at the top, either top left or top right. But in reality the sun should be in the south, and if you light a hillshade from the south, everything looks inverted: hills look like valleys and valleys look like hills, because of the way our. 15:51.96 archpodnet Um, yeah, sure. Um, ah, yeah. 16:06.82 Paul Brains process, you know, where light is supposed to come from. Um, and so I never use them. But I put a hillshade on, and holy crap, I could suddenly see these hundred-year-old roads that we had found relics of, clear as day, you know. 16:19.22 archpodnet Oh, nice. 16:24.98 Paul All sorts of details. And I was just kicking myself that we didn't do this before going out into the field and, you know, doing our test trenches and the like, because we could have targeted so much better, in so many cases, where we actually dug and did our work, had I seen this kind of imagery before actually going out in the field. Ah, yeah, jeez. Because all the other imagery that we had was either historical aerial photos or, um, you know, like Google satellite and Bing satellite and such, which you couldn't see through the tree cover with. But this DEM was. 16:43.43 archpodnet Jeez, yeah, yeah. 17:00.36 Paul You know, at one meter, you could absolutely see all sorts of great things. 17:03.90 archpodnet And see, this is what I think the authors of this article are getting at, and as we go to break here, I'll just, you know, have one last thought on this, and then we'll go to break. But I feel like, as archaeologists, I would love to get to the point where, you know, honestly, if we're talking about automation, the drones would probably be automated too, but basically, we send out the fleet of drones to take our, you know, various imagery, not just visible imagery, but, like you were mentioning earlier, doing some infrared, you know, multispectral kind of stuff, and do all that imagery. And then, probably, if we're capable of doing all these things automatically, realistically sending the information back to satellites in real time, but at the very least downloading it when you get back, and then running that imagery through all the processing stuff. But basically just hitting a button that says, yeah, do all these things, and having the computer do that, rather than us having to think, oh, let's go in and invert these colors and try this and do that. And like the one researcher you mentioned that was, you know, messing with the histogram, we do that with rock art all the time, right? To, to, what is that? What is the name of that program? Um, there's a program you can download. Yeah, it's escaping my mind right now. I know it's on my iPhone. But. 18:07.30 Paul Um, ah, yeah. 18:18.14 archpodnet You use a program to basically change the colors, because when you're using, ah, old paints and different peckings and stuff like that, sometimes the human eye just can't see it, and you need to change colors and do stuff to be able to see stuff. But if we had the right patterns in there, which is what these researchers are getting at. 18:26.90 Paul Mm.
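Paul's hillshade observation, that the light conventionally comes from the top left (azimuth 315 degrees) while lighting from the true southern sun (azimuth 180) makes relief read inverted, is easy to experiment with. Here is a minimal sketch that merges DEM tiles like the ones he downloaded and computes a hillshade with a configurable azimuth. It uses one standard hillshade formulation and assumes a north-up, one-band DEM; the file names are hypothetical.

```python
# Merge adjacent DEM tiles, then light the surface from any compass azimuth.
# Compare 315 deg (conventional top-left light) with 180 deg (southern sun).
import numpy as np
import rasterio
from rasterio.merge import merge

srcs = [rasterio.open(p) for p in
        ["tile1.tif", "tile2.tif", "tile3.tif", "tile4.tif"]]  # hypothetical
dem, transform = merge(srcs)          # mosaic the tiles onto one grid
dem = dem[0].astype(np.float64)       # first (only) band
cell = transform.a                    # pixel size in map units (e.g., 1 m)

def hillshade(dem, cell, azimuth_deg=315.0, altitude_deg=45.0):
    """One common hillshade formulation; sign conventions assume north-up."""
    az = np.radians(360.0 - azimuth_deg + 90.0)   # compass -> math angle
    alt = np.radians(altitude_deg)
    dy, dx = np.gradient(dem, cell)               # dz/dy (rows), dz/dx (cols)
    slope = np.arctan(np.hypot(dx, dy))
    aspect = np.arctan2(dy, -dx)
    shaded = (np.sin(alt) * np.cos(slope) +
              np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return (255 * np.clip(shaded, 0, 1)).astype(np.uint8)

top_left = hillshade(dem, cell, azimuth_deg=315)    # hills read as hills
from_south = hillshade(dem, cell, azimuth_deg=180)  # relief reads inverted
```

Flipping between the two outputs shows exactly the perceptual trick Paul describes: the terrain data never changes, only where the brain assumes the light is coming from.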
18:36.70 archpodnet And if we had enough of those in there that it could learn, then it could run through all these different things on its own and come back and say, listen, in this one I found this, and in this one I found this, and in this one I found this. Go figure it out, Mr. Archaeologist or Mrs. Archaeologist. So, um, you know, ah, rather, Doctor. Sorry, ah. 18:51.39 Paul Doctor, yeah. Don't use the gendered language. Jeez, ah. 18:54.18 archpodnet Sorry, yeah, I know, I know, right? So, um, anyway, that sounds like a good point to stop. Let's wrap up this article, because I like where it's going. It's getting into some pretty cool spots, and we'll talk about that on the other side of the break.
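Neither host could recall the rock art program's name on air, and it stays unnamed here, but one widely used technique behind this kind of false-color pigment enhancement is a decorrelation stretch: rotate the RGB bands into their principal components, equalize the variance of each component, and rotate back, so faint color differences the eye misses become saturated color shifts. A minimal sketch, not tied to any particular app:

```python
# Decorrelation stretch: PCA-rotate the color bands, equalize component
# variances, rotate back. Faint pigment traces become vivid color.
import numpy as np

def decorrelation_stretch(rgb: np.ndarray) -> np.ndarray:
    """rgb: H x W x 3 uint8 image; returns the stretched uint8 image."""
    flat = rgb.reshape(-1, 3).astype(np.float64)
    mean = flat.mean(axis=0)
    cov = np.cov(flat - mean, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    eigvals = np.maximum(eigvals, 1e-6)           # avoid divide-by-zero
    # Whiten each principal component, then restore a common variance.
    target_sd = np.sqrt(eigvals.mean())
    transform = eigvecs @ np.diag(target_sd / np.sqrt(eigvals)) @ eigvecs.T
    out = (flat - mean) @ transform.T + mean
    return np.clip(out, 0, 255).reshape(rgb.shape).astype(np.uint8)
```

Feeding a photo of a faded pictograph panel through this and comparing it with the original is a quick way to see why field archaeologists keep a tool like this on their phones.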