Transcription provided by Huntsville AI Transcribe
So for tonight, I was just going to go through some of the data processing that I do.
I just want to show you briefly. You can go get these models.
Of course, you can purchase them.
And then they integrate with, say, Photoshop, and then I use some of these sites.
I don’t actually have all of these.
I’m just going to demonstrate some of them. I actually try to use a lot of open source software. And then at the end, as kind of a finale, I want to go over taking an image and then creating a video with it. I just did it for fun on some new hardware I got, and I was like, this is cool, and I thought you guys would enjoy it.

So without further ado, I’m Lauren. I’ve got about six years in processing astrophotography images. I’d say the first six months is all just getting the right hardware, integrating the hardware, integrating all the software. It is a serious struggle. To that point: debugging. So I think it was about a year and a half ago now, I couldn’t get the mount to move. I couldn’t move the telescope, basically. And I was like, what is going on here? I changed out the firmware.
I wiped the entire OS, changed out all the cabling, everything. I finally was like, I must have messed up the firmware somehow, and ordered a brand new chip for the mount. When I pulled it out, it was an actual bug. An actual bug. And it should have been fried. And that’s why I couldn’t connect to the board.
You cleaned it and it worked? What? You cleaned it and it worked? No, I pulled out the whole board and put a whole new board in. And it worked out. Thank you. Yeah, so it worked out. Astrophotography is very much a character-building experience overall, from even just sitting there and looking at something that’s millions of light-years away, that you can never touch, that you will never get to go to, and that humans will never alter. So you’ve got to stay resilient. You’ve got to put in a lot of hard work. It’s a serious struggle, so you have to be resilient. And then I’ve got this Grace Hopper moment up here on the right: when I literally pulled the bug out, I was like, oh my gosh, this is a great spot for a moment. So we’ll circle back to this in a little bit. So how do you capture a nebula?
So it’s a series of things.
It’s all the hardware and equipment.
It’s the software. It’s all of the processing and AI techniques. And then at the end, I’ll show you kind of like a comparison against Hubble and what that looks like and then how I created some videos. So the equipment. So the hardware beast. So you’ve got the mount to track the stars.
You’ve got the optics, which I would think of as like a light bucket.
If you want to think of it that way the whole way through, that would help you. So you’re collecting light and you see the secondary and primary mirror there in the optics.
So the light’s coming in, hitting the big mirror in the back, coming into the secondary and then going to the camera basically.
And then you’ve got the guide scope that’s on top of the scope to help literally hold the image as still as possible while you’re capturing. Then there’s a whole host of the capturing side of things, which is the filters, which if you guys are familiar with radars and things like that, because we know we’re in this town, there’s filters for radars, right? Okay, well there’s filters for astrophotography. And then you’ve got the main camera, the guide camera, and then basically focusing.
right?
So I even have it down to a motor that focuses the stars and I’ll show you exactly how that works in a little bit. So this is all of that hardware integrated together.
Remember this takes about six months of purchasing and nailing all the items down and integrating everything together.
So once again, on the left you’ve got kind of a light bucket, that small little image at the top there.
And so just consider this as a big bucket that you’re just trying to capture all the light from.
So that’s going to come in from the stars, the light coming into the optics, bounce, and then come into the camera. And there’s your camera with an auto-focuser and filter wheel. And a focal reducer is just something that basically lowers your focal ratio, your f-number.
So it’s more like increasing your brightness, I would say. And then you’ve got the guide scope, which is basically keeping all of this extremely still for about two minutes, while you just, for lack of a better term (this is digital, but) keep the lens open, right? You’re just kind of keeping your lens open and everything really still. Okay, and then you have your dependency hell. Which is… no, this is why I’m bracing you all for this. Okay, so, yes, you’ve got a lot of software that’s gotta all work together.
It’s mostly open source, except maybe the processing side of the house.
So the mount driver: keeping that up to date, keeping the firmware in there, and keeping the stars tracked with your mount. Then you’ve got to get your mount in the reference frame of the stars, and I use SharpCap for that. And then the camera is ZWO.
And then there’s another guiding software that’s open source called “Push Here Dummy,” or PHD. So some of them have a pretty good sense of humor.
I use Stellarium to select the object.
Usually, maybe even two hours before night falls, I’m just going and looking at the sky’s clear.
So what can I look at tonight?
What do I want to look at?
And then there’s a whole automatic capturing of the images overnight.
So you’re going to do this for eight hours, and you’re going to be asleep, because you’re going to want to sleep. So that’s NINA. That’s number six there.
Then, so now you’ve spent like eight hours capturing all the data overnight.
Now you want to get a good signal to noise ratio.
I use DeepSkyStacker for that and stack everything together.
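Since stacking is what buys the signal-to-noise, here is a minimal numpy sketch of why it works, on synthetic data. DeepSkyStacker also aligns frames and sigma-clips outliers, which this skips:

```python
import numpy as np

rng = np.random.default_rng(0)
signal = 100.0                                           # true pixel value
frames = signal + rng.normal(0, 10, size=(64, 10_000))   # 64 noisy sub-frames

single_noise = frames[0].std()               # noise of one frame, ~10
stacked = frames.mean(axis=0)                # average the whole stack
stacked_noise = stacked.std()                # ~10 / sqrt(64) = 1.25
improvement = single_noise / stacked_noise   # ~8x, i.e. ~sqrt(N) for N frames
```

The uncorrelated noise averages away while the signal stays put, which is why a 30-hour stack beats a single exposure.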
And then the fun part begins, with the PixInsight processing and the AI, which is that RC Astro stuff. Okay. So the struggle is real.
So this is the whole process that I go through, um, beginning to end every night. I have found the best avenue is to literally leave your setup up.
Everything connected almost down to the wire and cover it. And so when, you know, when it’s windy and there’s tornadoes coming, I’m like, oh, dicey. But I actually still, majority of the time I leave it up.
Yeah, so. Yeah, when you get it all working, it feels a lot like E.T.: it’s working, it’s working. And most of the time it’s pretty stable now. So once you get it all integrated and it’s clear skies, which is that Astrospheric link there that you guys can check out.
So once it’s clear and everything’s good to go, this is kind of my setup. I’ve got a security camera there on the left. And then this is NINA. And the image there is, and we’ll get into this, a kind of stretched image so I can see that I’ve got the right target and it’s framed properly. So this is a very altered image, I just want to let you know that. And then there at the bottom is the guiding, that’s the guiding error. It’s like a game: you’re trying to get below one all the time. So that’s just there for me to be like, okay, everything’s tracking properly and we’ve got a real good still image. And then on the right is that auto-focuser that I’m always watching when I kick off. It’s a hyperbolic fit.
So it’s basically machine learning there where you’re just trying to make sure that the star is pinpointed.
Any questions? So the star that you want is pinpointed?
It just selects it.
Okay.
It selects this on its own.
Takes care of it. There are a lot of just nice-to-haves in here that NINA handles for you.
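That hyperbolic fit is, under the hood, ordinary curve fitting of star size (HFR) against focuser position, then driving to the minimum. A sketch of the idea on synthetic data; the model and names here are illustrative, not NINA's actual implementation:

```python
import numpy as np
from scipy.optimize import curve_fit

def hfr_model(pos, a, b, c, p0):
    """Hyperbola: star HFR vs focuser position, with its minimum at p0."""
    return c + a * np.sqrt(1 + ((pos - p0) / b) ** 2)

# Synthetic focus sweep: true best focus at position 5000
positions = np.linspace(4000, 6000, 21)
hfr = hfr_model(positions, 3.0, 150.0, 1.2, 5000.0)
hfr += np.random.default_rng(1).normal(0, 0.05, hfr.size)  # measurement noise

params, _ = curve_fit(hfr_model, positions, hfr,
                      p0=[1.0, 100.0, 1.0, positions.mean()])
best_focus = params[3]   # focuser position where the stars are smallest
```

The autofocus routine then commands the motorized focuser to that position.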
Would you mind sharing these slides on the discord or something?
Oh yeah, sure.
I’d love to.
Okay, so I’m not going to go through this. I have this at the top of every slide so you can kind of see the whole process, beginning to end. I’ve got the image from last night, right? And I’m going to process this all the way to the very end so you guys can see how that happens. Basically, it’s the data science pipeline for astro.
So you spend all night, thank goodness you’ve automated and you wake up in the morning, you’re excited, and you’re like, what image do I get?
You got the one on the right.
You got the one on the right. This is a low-scale, for lack of a better term, image of the Eagle Nebula.
And again, you’ve kind of captured all of this light in a bucket, right?
So it’s still there. So if you want to think of it as a little bit of light at the bottom of the bucket, you kind of want to scale it up and pull it up to fill the cup, right?
I’m oversimplifying, I apologize, but I just want to make it simple for you guys. But that’s what happens when you go to the high scale. Yeah, okay, pretty cool. So this is the Eagle Nebula. It’s one of my favorites. It’s in the summer, and a lot of the nebulas are in the summer, in the Milky Way, right? So when you think of the Milky Way, you think of nebulas, you think of dust, right? Well, this is one of the things that’s up, so I’m always really excited about the summer. And just for a fun note, the winter is mostly galaxies, right? So if you’re more interested in that, that’s more winter. So keeping it simple, like you said, that stretching is kind of like balancing levels.
You’re stretching the histogram curve, right?
So you’re taking that light. So if you notice this curve right here on the left, you’re shifting it to the right. Yeah. Cool. So now you’ve got a stretched image, but
it’s not as clean as it could be, and in this case you’ve basically just got the red channel, which is hydrogen.
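For the curious, the stretch just described can be sketched with the midtone transfer function that PixInsight-style tools use; a minimal numpy version (the midtone value here is illustrative):

```python
import numpy as np

def mtf(x, m):
    """Midtone transfer function used for nonlinear stretching.
    Maps [0, 1] -> [0, 1] with mtf(0)=0, mtf(1)=1, and mtf(m)=0.5,
    so faint values near zero get pushed up toward the midtones."""
    return ((m - 1) * x) / ((2 * m - 1) * x - m)

# A faint "bottom of the bucket" linear image: values clustered near 0
linear = np.linspace(0.0, 0.05, 6)
stretched = mtf(linear, m=0.01)   # a small midtone = an aggressive stretch
```

A pixel at 0.05 (5% of full scale) comes out above 0.8, which is exactly the "pull the light up to fill the cup" effect.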
So you’ve got to do that for all of your colors and everything. So now I want to take a black and white image and I want to reduce the blur. And so this is actually a GIF. So you can kind of see it once the blur has been removed, the stars kind of brighten up a little bit. And then so that’s the first thing you want to try to remove as much noise very early in the process as possible. And then this is the noise.
So there’s how noisy it is when it comes out.
And then you remove it.
And these are AI models, right?
This is that BlurXTerminator AI model that I mentioned earlier.
And there’s your NoiseXTerminator.
I’m sorry.
Go ahead. You usually go blur first, then noise? You’ve found that works best, or… Yeah, and mostly it’s like an art, right? Obviously it’s photography, so it’s an art. I watch a lot of YouTube, and I basically figure out what somebody else’s process is, and then I kind of modify it to my own liking, but yeah, that’s what they do. Yeah, exactly. They usually do the blur, and then they go… Yeah. Okay, so, at risk of oversimplifying again, there’s actually some math in the back end here.
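BlurXTerminator itself is a proprietary neural network, but the classical technique it replaces is deconvolution. A toy Richardson-Lucy sketch, assuming a known Gaussian PSF (real star PSFs vary across the field, which is exactly what the AI models handle better):

```python
import numpy as np
from scipy.ndimage import convolve

def gaussian_psf(size=9, sigma=1.5):
    """A simple symmetric Gaussian point-spread function, normalized to 1."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def richardson_lucy(observed, psf, iters=30, eps=1e-12):
    """Classical Richardson-Lucy deconvolution via multiplicative updates."""
    est = np.full_like(observed, observed.mean())  # flat, positive start
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iters):
        blurred = convolve(est, psf, mode="constant")
        ratio = observed / (blurred + eps)
        est *= convolve(ratio, psf_mirror, mode="constant")
    return est

# A single blurred "star": a point source smeared by the PSF
star = np.zeros((41, 41))
star[20, 20] = 1.0
psf = gaussian_psf()
blurred = convolve(star, psf, mode="constant")
sharpened = richardson_lucy(blurred, psf)   # flux re-concentrates at the star
```

The deconvolved star is tighter and brighter at its peak than the blurred one, which matches the "stars brighten up" effect in the GIF.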
But hydrogen is kind of your red channel, and then you’ve got oxygen is green and sulfur is like blue, I believe, roughly. And so you kind of want to combine all of them. You would use an LRGB recombination for that.
And then I’ve shown you the wavelengths that are up here that are associated with these three channels.
So is it a color camera?
It’s actually a black and white camera, and then you put the narrowband filters in.
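The narrowband-to-color step can be sketched in a few lines. This uses the standard SHO (Hubble palette) assignment of sulfur to red, hydrogen-alpha to green, and oxygen to blue; `combine_sho` is my own illustrative name, not a PixInsight function:

```python
import numpy as np

def combine_sho(s2, ha, o3):
    """Map three narrowband mono frames to an RGB cube using the
    SHO (Hubble) palette: SII -> red, H-alpha -> green, OIII -> blue."""
    return np.stack([s2, ha, o3], axis=-1)

h, w = 4, 4
s2 = np.full((h, w), 0.2)   # sulfur-II   (~672 nm)
ha = np.full((h, w), 0.8)   # hydrogen-alpha (~656 nm)
o3 = np.full((h, w), 0.4)   # oxygen-III  (~501 nm)
rgb = combine_sho(s2, ha, o3)   # shape (4, 4, 3)
```

Since hydrogen dominates most emission nebulas, a straight combination comes out green, which is where the color-balancing step after this comes in.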
And then after all that hard work, you’re actually going to end up with a green image. Why? Because Huntsville is light-polluting your night, and that’s the airglow there on the left. And so you want to clean that up. And there’s another tool.
This is not an AI tool. This is just a tool that comes with PixInsight to reduce your noise and clean up your color. And then, okay, so now I’ve kind of got this color image. But now what I’m going to do is contrast it further. But I want to remove the stars. Well, here’s another really good AI model.
I’ve actually used ones that are not AI.
And they chronically mess it up because not all stars are the same. But the AI models are just so generalized that they easily remove it.
So it makes it a lot easier to get the image on the right.
And yeah, these neural networks I’m showing you are trained on James Webb and Hubble data.
And then now I’m going to try to improve the contrast with this high dynamic range multi-transform.
So you can see there on the right, the contrasting that occurs.
This is also not an AI algorithm.
It’s just part of PixInsight. What I’m also doing, that I’m just kind of highlighting there on the left, is there’s ways to create masking, so that you can just do the contrast on the object that you’re most interested in.
And so that’s the process there on the left.
It’s fairly easy to do that.
But once you’ve kind of isolated your object, you wouldn’t want to like do a high contrast on noise in the background, right? You want to kind of focus on that. So that’s why we mask before we do that. OK, and then this is kind of a hint to some of the math that’s in the back end.
I’ve got some equations that I use to do this.
But there’s something called pixel math that you can use, put the equations in there, and basically add the stars back in when you’re done.
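A common PixelMath recipe for the star add-back is a screen blend rather than a straight sum, so bright regions do not clip past full scale; a sketch (the exact expression varies by workflow):

```python
import numpy as np

def screen_blend(starless, stars):
    """Screen blend, a usual PixelMath star add-back:
    combined = 1 - (1 - starless) * (1 - stars).
    The same expression in PixInsight's PixelMath inversion
    syntax is  ~((~starless) * (~stars))."""
    return 1.0 - (1.0 - starless) * (1.0 - stars)

# Normalized [0, 1] pixel values: nebula-only image plus the star layer
starless = np.array([0.0, 0.5, 0.9])
stars    = np.array([0.8, 0.8, 0.8])
combined = screen_blend(starless, stars)   # stays within [0, 1]
```

Unlike `starless + stars`, the screen blend can never exceed 1.0, so bright nebulosity with a star on top does not blow out.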
So that’s me adding the stars back in. And then I’m trying to bring everything together. So I’m taking all that really high contrast that we did. using the color and then I’m recombining from left to right.
So for the whirlpool galaxy that’s on the left there, I’m doing more of a color high contrast on the right and the same with the Eagle Nebula.
Yeah, and so this is that final image and the Eagle Nebula was discovered in 1745.
It is one to two million years old and 5,700 light years away.
So that’s always very striking to me.
It’s a humbling experience. Maybe you’ve been at work all day, or maybe you guys have kids, and things are stressful, and you guys have personal lives. And to me, I’m sitting in the backyard, and I look up at this, and all my problems melt away. This is 5,700 light years away.
No one’s gonna mess with it. It’s gonna stay just the way it is in its beauty and glory. And it’s right there. It never changes.
Okay. The Hubble is on the left.
The middle is another astrophotographer like myself, Trevor. And then that’s from my backyard. So that’s pretty cool. Yeah. I did a lot of contrasting on the right there, a little bit more just to get that image right.

And then, kind of circling back to this being a character-building experience, there’s a quote from Carl Sagan talking about how astronomy is very compelling and character building, where he states there’s perhaps no better demonstration of the folly of human conceits than this distant image of our tiny world: “To me, it underscores our responsibility to deal more kindly with one another, and to preserve and cherish the pale blue dot, the only home we’ve ever known.”

Right, and then, as I promised, this is also the Eagle Nebula. This is on my YouTube channel that I created just for you guys. Yeah, there’s barely a view on it. This is the original image, right, and this is what the diffusion model did. This is Wan 2.2, and I used something called ComfyUI. I can give Jay some links. So what I did is, you might notice that that’s about 15 seconds, but some of these diffusion models only work for about five. So I took one image and let it generate, and then kept generating out several times. There’s some other ones. This one’s the Orion Nebula that I showed you before.
I always kind of think of it like, it feels a little bit like Star Trek when it infers the… There’s a music choice there though. I’m seeing it down at the bottom left.
What?
No soundtrack.
Yeah, there’s a soundtrack, but I think we said not little echo, right? So the colors are added afterwards. You get three black and whites: sulfur, hydrogen, and oxygen. Right. And then you’re going to basically de-noise it and de-blur it and get as much noise out of it as possible. Then you combine it into a color image, and you kind of hold that color one aside. And there’s a way to extract the black and white again; it’s basically called luminance, but it’s just a high-contrast image of the color. And then you alter that a bunch, get really high contrast, and then put it back. I think it is kind of weird. You’re taking in light waves that aren’t things that we see, and you have to assign a color to them. Yeah, I thought maybe even Hubble would not actually pick up color. Correct.
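The luminance extraction just described boils down to collapsing the RGB cube to one mono channel, stretching that hard, and putting it back. A minimal sketch of the extraction step (equal channel weights for simplicity, where real tools typically use perceptual weights such as Rec.709's 0.2126/0.7152/0.0722):

```python
import numpy as np

def extract_luminance(rgb):
    """Pull a mono luminance channel out of an RGB image.
    Equal weights here for simplicity; real luminance extraction
    usually weights the channels perceptually."""
    return rgb.mean(axis=-1)

# A mostly green (hydrogen-dominant) 2x2 color image
rgb = np.zeros((2, 2, 3))
rgb[..., 1] = 0.9
lum = extract_luminance(rgb)   # shape (2, 2) mono image
```

The contrast work then happens on `lum` alone, so color balance is untouched when it is recombined with the chrominance.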
NASA adds it afterwards, I’m thinking. Based on what? How do you come up with that? I mean, the pictures are impressive, but I keep wondering, especially when you’re talking about 5,700 light-years away. Well, some of it you can kind of calculate: okay, this element will look like this.
Right, but there’s actual, what I kind of oversimplified there is that there’s an equation for red.
And it’s called the Hubble palette.
So I’m actually following Hubble’s color palette.
But you can use other ones.
There’s one that they call SHO for sulfur hydrogen oxygen.
And I guess that’s like the balance of it.
And it’s more like this color.
This is the SHO palette.
So basically what I’m getting at is you decide the colors.
Okay, okay. Even James Webb is infrared-based.
That’s right. So it’s not real. It’s not what you think of as our… Right, right. Yeah, I was reading that they use newer telescopes that can pick up infrared and different spectrums, and then infer what it would look like. So coming from your backyard, that’s pretty impressive. Yeah, right. Yeah, it did take a while. You’ve got to go through the whole atmosphere and all that, which is the worst.

Yes, yes. I think the hardest part, I would say, is probably getting everything in focus and tracking, getting those two skill sets down and getting that image to be very, very still. That is probably your hardest part, because initially I got so many blurry images, or things weren’t tracking. And when that happens, you’re trying to do that for eight hours, so your data set gets really shrunk down, and then it’s not so good and you don’t get this clarity.

In one of your readouts earlier, you showed things like, let’s say there’s a five-minute wind that kind of shakes it, and you’re like, hey, I can just cut out that section and still have all the rest of it. Is that easy to find or identify?
Yes, but I actually have to handle that manually.
And every time I do it, I think to myself, I wonder if I could get a model to do this.
So what I see, David, is the stars trail.
So it’ll, when you get strong wind or trees… trees are very peculiar, because maybe the image before was fine, and then as you creep into a tree it looks almost like a cloud, right? And what it’ll do is it’ll start trailing and you’ll lose it. And between clouds or trees, the rest of your data set goes. So you’re just like, oh, I’m hitting it, and I have to delete a bunch.

How big are these files? I think I told Alexa this once. I remember teasing you because it was like, that’s why you have a NAS. I can’t remember what you said now. But that’s all three channels, the whole night. That’s just the raw data before you play with it.
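That manual cull of wind- and tree-trailed frames could plausibly be automated: trailed stars show up as elongated intensity distributions. A toy elongation metric from second moments; the rejection cutoff is made up:

```python
import numpy as np

def elongation(img):
    """Ratio of major to minor axis of the image's intensity
    distribution, from second central moments: ~1.0 for a round
    star, noticeably >1 for a wind-trailed one."""
    ys, xs = np.indices(img.shape)
    w = img / img.sum()                       # intensity weights
    cy, cx = (w * ys).sum(), (w * xs).sum()   # centroid
    cov = np.array([
        [(w * (ys - cy) ** 2).sum(),        (w * (ys - cy) * (xs - cx)).sum()],
        [(w * (ys - cy) * (xs - cx)).sum(), (w * (xs - cx) ** 2).sum()],
    ])
    evals = np.linalg.eigvalsh(cov)           # sorted ascending
    return float(np.sqrt(evals[1] / evals[0]))

# Synthetic round star vs. wind-trailed star (elongated in x)
ys, xs = np.indices((31, 31))
round_star   = np.exp(-(((ys - 15) / 2.0) ** 2 + ((xs - 15) / 2.0) ** 2))
trailed_star = np.exp(-(((ys - 15) / 2.0) ** 2 + ((xs - 15) / 6.0) ** 2))

reject = elongation(trailed_star) > 1.5   # hypothetical cutoff for culling
```

A per-frame score like this over a few bright star cutouts would flag the bad subs before stacking, which is essentially the model idea floated above.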
And the great thing is because this thing doesn’t change that much, next year you can add. You just keep adding. So you want to keep it around. So how many bad nights were there before you finally got a clear image? You mean? Collecting.
You know, in the morning you could wake up and find out: oh, this is terrible. Oh, okay.
All right. How persistent did you have to be?
How persistent?
Yeah, before it finally clicked and I went, I got something. Okay, so what you’re talking about is mainly due to weather. So this is Astrospheric, and I haven’t looked at this before I came.
Oh, well, you sent me this link and I lost about 30 minutes of my life clicking through. So I’m looking for these kinds of windows, but all night. And so I would say, this would be a bad night, I wouldn’t even try, right? But if it’s clear, and that first set looks good, I’m watching it on that kind of command module thing, I’m watching everything, and I’m like, okay, it looks good, and I walk away. The likelihood is really high that in the morning it’s sitting there.
The other thing, John, is doing it more than one night. So okay, that works, and maybe I’ve got like three or four clear nights, which is amazing around here, and I’m getting like 30 hours.
That’s when you didn’t move the tripod and so forth, right? Well, I never move it. Because if you move it, then it’s like, I gotta get it all aligned again. No, like I told you, all I do is unhook the wires and cover it. You know how they handle that? I have one co-worker.
Yeah. He retired a while back, maybe five, ten years ago, and moved… he found the darkest place in Alabama he could find and moved there, and then built his own little observatory, mostly to protect his telescope so he doesn’t have to take it down and put it up again.
He’s got the roof that rolls over, and this is what he does full time now that he’s retired.
I may have to see if I can find him again and send him your charts. Is your telescope on like a concrete base? Nope, it’s just on… I think I’ve got a good picture.
Yeah, I was wondering about that too. Is it just on the ground? Well, they don’t really do rocket tests… So it’s just on the ground. They don’t do rocket tests at night that much. I was wondering about that too. Oh, what about the earth itself? My house shakes when they do a rocket test like that.
Have you done any other long-exposure experiments, now that you have this framework together, like doing regular long-exposure photography but then running it through this? I’ve heard of that. No, I haven’t tried that. It would be interesting to see just what you could get from some very long duration, long exposure, running it through this, just to see.
The only thing I have experimented with is the sun. And finding a solar filter that was big enough for the tube.
I got that and put that on and then got that really big sunspot that we had a while back.
I got a picture of that.
Yeah, but I wasn’t very good at that.
That would be an example of maybe it being very blurry. Doesn’t that move a lot?
So your window got shorter as far as making a clean picture? Well, these images are, what,
5,700 light-years away, and the sun is eight light-minutes away.
So it’s much easier to get planets. And the sun, I would tell you that that’s a totally different process.
And so you would actually likely need a whole other setup to do solar. I cannot remember his name, but there’s a photographer I follow on Instagram who specifically does shots of the sun. But it’s filters, setting it up on a tripod, getting it oriented a certain way. And he’s still just post-processing, but it’s always just a quick exposure, because there’s so much light coming in that the sensor is just going to die on his camera before it gets anywhere. His are like a hundred milliseconds, and this is like a two-minute exposure. And even looking at, like, the full moon with human eyes through the power of a telescope, you have to be careful, don’t you?
Yes. There’s certain software that comes with the cameras, the ZWO cameras. Those are Chinese cameras, I don’t know if you guys are concerned about anything like that, but ZWO is excellent for astrophotography. And you can basically check some boxes when you go to point at the moon, and it will quickly just change all the resolutions and everything. And that way you’re not burning out your camera or your eye. This username is thankfully very easy.
It’s cosmic underscore background on Instagram. I was thinking it was going to be something a lot crazier. Do you try different kinds? I know some people doing normal photography will go get a used digital camera and then have it converted, and now it’s an infrared camera, because the sensor picks up on that stuff. You just remove the part of the filter that filters out infrared. Yeah, and you can get some really interesting images off that. I don’t know if astrophotographers do anything like that or not. I think my first thought is JWST, right?
How it sees in the infrared.
And then, can I collect enough light?
It’s just different wavelengths of the same light.
Yeah.
Um, I don’t know.
The astrophotography cameras also have a cooler on the back to do temperature regulation. Okay. You can keep the sensor at an exact temperature the entire time.
Wow. Yeah. Super precise. It’s… negative 10 Celsius all the time.
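The point of that cooling: dark current in a sensor roughly halves for every ~6 degrees C you cool it. That doubling temperature is a rule of thumb and is sensor-specific, so treat the numbers as illustrative:

```python
# Rule-of-thumb sketch: CMOS dark current roughly halves for every
# ~6 C of cooling. The exact doubling temperature varies by sensor,
# so check the datasheet; 6.0 here is an assumption.

DOUBLING_TEMP_C = 6.0

def dark_current_factor(t_ambient_c, t_cooled_c):
    """How many times lower the thermal signal is after cooling."""
    return 2 ** ((t_ambient_c - t_cooled_c) / DOUBLING_TEMP_C)

# Cooling from a 20 C summer night down to -10 C:
reduction = dark_current_factor(20.0, -10.0)   # 2**(30/6) = 32x less dark current
```

Holding the sensor at a fixed setpoint also means the dark frames you calibrate with match the light frames exactly.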
Speaking of those kinds of problems: most of my background in photography was darkroom and manual.
I was just thinking how terrible this would have been with that. But just a quick question: who here has actually done darkroom stuff? Anybody here?
So weird.
Yes, I bought some darkroom stuff from you. Not sure if that has anything to do with AI, but the need to tinker with things is probably pretty high. What crossed my mind is the filter that you have to… I think it was your noise filter. The one that I normally use would wind up removing your stars, because on my film those look like dust. Yes! Yes. So, yeah, my dust filter is your star remover. Oh, yeah. Same algorithm. Yeah. I mean, it’s an interesting thing, though. Yeah, the stars are actually how you get the image to begin with. It has to lock in on stars. So planets are actually really hard, like Jupiter, because there’s nothing to lock on, there’s just an object to keep still. Because you’re too zoomed in? Because you’re too zoomed in. Well, you’re zoomed in, but then it’s so bright that you can’t see the stars. Oh, it washes it out.
Washes it out. Have you ever looked into… I think I’ve got this right, but somebody with Google or ChatGPT, correct me.
The actual algorithm that they started off with for fingerprint detection.
You know where they came from? Looking at stars and pattern matching.
What?
The plate solving thing.
That thing. Yes.
You find different pieces of the fingerprint.
This is, I know this is this pattern.
Mm-hmm.
Just like how, if you had this algorithm, I’m looking at these stars.
I know this pattern.
I know where I’m at.
I know… Yes.
I don’t… I don’t remember when that was.
Anyway.
So I still feel really silly about this.
This is one of the things that took me so long. I was getting all of this integrated, all the hardware, all the optics, the software, and I finally found NINA, and it had plate solving, with this fingerprint algorithm in it. It said, like, plate solve the object, and I was like, okay, I guess I’ll try this. I could not, for the life of me, get the object in the field of view very regularly; I had to kind of search around every night to get it. And I finally realized that I could just use this plate solving. It basically said, you need to go add this database and this piece of software. And I’m like, oh god, one more. Okay.
So I add this in and click the button.
And now every night, I type in my object, I hit plate solve, and I’m there.
But that took me six months of trying to get everything working. The weirdest thing is, way back in the day, folks on ships that were out at sea, with no land in sight, the way they actually navigated was looking at the stars, because they were in the same place.
Well, okay. So this is actually my astrophotography laptop that I use out there. It’s actually live now, and I’m hoping that I can go in and show you. Yeah, so there are a lot of plugins that you can go try that I am really excited to go do, but of course I’m always taking pictures of things, or I’ve got some latest project to work on. What is the Discord alert notification?
Is that so it can ping you when the picture is done?
There’s so much stuff out here.
There’s Exoplanets. So this is one that I’ve been really curious about.
So you can point it at a star and then the light from the star will be blocked by the planet and then the flux of it is going to drop and then you can actually detect exoplanets with your telescope.
I’m like, what?
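The transit method being described comes down to spotting a dip in a light curve: the star's flux drops while the planet crosses. A toy sketch with synthetic data (real detections fold over the orbital period and need far better statistics):

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(1000)                           # time samples
flux = 1.0 + rng.normal(0, 0.0005, t.size)    # normalized star flux
flux[400:430] -= 0.01                         # a 1% dip: planet in transit

baseline = np.median(flux)                    # out-of-transit level
sigma = np.std(flux[:300])                    # out-of-transit scatter
in_transit = flux < baseline - 5 * sigma      # simple 5-sigma detection mask

depth = baseline - flux[in_transit].mean()    # fraction of starlight blocked
```

A 1% depth is roughly what a Jupiter-sized planet carves out of a Sun-like star, which is why backyard-scale gear can actually see it.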
This is cool. But yeah, good point about the Discord notifications. There’s also a comet one that I’m trying to remember. It sounds like you need an experimenter agent to go run through all these things while you’re at work and come back and say, here’s something decent, the rest are garbage. Yeah, right? This didn’t work. I actually tried recently; somebody at work was like, can you take a picture of a comet? And I’m like, okay, I’ll give it a shot. And I downloaded this plugin. I’ll probably have to go look for it. I couldn’t get the database to work, and you’re right, there’s definitely an experiment level there. That’s a good question. Like you mentioned the sunspot, getting a picture of that. Obviously we have ATLAS and all that, so I figured you probably have the same problems with that as any other solar system object.
Like if you try to capture 3I/ATLAS.
Yes.
Would it probably be as difficult as a planet or?
3I/ATLAS?
No, that’s the new interstellar one.
Oh, like the International Space Station? Like a satellite? No, no, it’s the… It’s a comet. Oh, right. It’s the ATLAS one. That was, I think, the one I was trying to get. Or one of the… there are multiple ATLAS comets. And then… yeah. So this is how you would update it.
Specifically, I guess you pull from the Jet Propulsion Laboratory in California.
You pull from there. I didn’t need the asteroids. And then loading the current solar system bodies.
And then at that point, I think I was able to select from here and select this comet.
And then I can kind of show you what the sequence on it looks like. So once you type in the object here, you’re going to load it.
And then you’ll do like that. I know this is offline.
And then there’s the solution.
And so then it kicks off the whole sequence.
And that gets you where you need to be. So I know things like satellites and planes can be frustrating, but have you gotten any that were like, well, that totally messed up my image? Yes, yeah, I regularly get Starlinks. They are super bright. It looks way brighter than a plane, and it’s like huge streaks through your image. It’s really cool. I do find it fun. I go back to Stellarium and try to figure out which one it was, because I’ve captured it, right? So now I know my window of time, and which one it was.
So Stellarium has like a plug-in to where you can just say like, I want to look at satellites.
One really cool thing I’ve seen other people do, and I’ve never done it, is the International Space Station, which is why that was my favorite question. One of these plugins lets you take a picture of the International Space Station, but you’ve got to time it right. Because it’s going overhead, and you’re trying to get it as much as possible, taking really, really quick snapshots, you end up with a three-dimensional video, because it’s rotating through the sky. It’s really cool. I can’t remember the name of the app or the name of the developer, but it’s this super vague one that’s in the Microsoft store, because it’s literally just satellite tracking.
But I like that one a lot because it’ll give you a countdown for the next pass.
And in relation to where your lat long is when you set them up.
If you go to the NASA website and put in your zip code, it will tell you when it’s going by. I used to use that a lot for, like, I’m looking for different NOAA passes, because I was trying to pick up stuff on my antenna.
And I was like, OK, I’ve got 13 minutes till this one. I’ve got an hour and a half till this one.
Juggling multiple tracks and then being like, oh, the ISS pass is over, I can't do jack about that. Speaking of satellites, apparently my reception is so bad out here that my phone is now on a satellite connection.
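Those pass windows come straight from orbital mechanics. As a rough sanity check (this is not how tracking apps actually compute passes — they use TLE orbital elements — and the 420 km altitude is just a typical ISS figure assumed for illustration), Kepler's third law gives the repeat cadence of a low-Earth-orbit pass:

```python
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6           # mean Earth radius, m

def orbital_period_minutes(altitude_km: float) -> float:
    """Circular-orbit period from Kepler's third law: T = 2*pi*sqrt(a^3/mu)."""
    a = R_EARTH + altitude_km * 1e3   # semi-major axis, m
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60

# The ISS orbits near ~420 km, so its ground track repeats roughly every 93 minutes.
print(round(orbital_period_minutes(420), 1))
```

That ~93-minute cadence is why a countdown app is so handy: miss one pass and the next visible one may be an orbit or several orbits later.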
Oh my goodness.
But that works.
We went up to a spot in Tennessee where there was absolutely no cellular or anything and still sent a text. This remote copy feature is extremely helpful: it pushes all of the images from the night back to my house. You know what would be kind of clever is showing you how this works. This whole setup is a sequence for hydrogen, oxygen, and sulfur for a general object. If I've got Stellarium up, or this other piece of software — I'm just now realizing I don't have very good internet where I'm working, and I never thought I'd be in a position where I'd actually need it — I can select the target and pull from that software, and it updates your right ascension and declination, which is basically the reference frame of the stars.
So I'm not going to kick this off, because I'm not connected to anything — that's what these arrows are — but basically you wait until after dark. I can start this at five when I get home and just let it wait.
Then it slews and centers, and that's the plate solving that Jay and I were talking about.
This is the guiding, so it's using that PHD2 guider to kick this off.
And then at that point, you're in a good tracking position — you've got the tracking squared away. So that's when I move into this next piece, which is the same idea.
So I would also click this button and grab the target object.
Then you’ve got the cooling.
which, as the other gentleman mentioned, is like zero Celsius — keeping the camera really, really cool. I'm still waiting at this point just in case, because you never know if you've messed with something.
I've done this, so I've always had these safety constraints put in — don't-go-below-the-horizon kind of things. Things for when, you know, I'm asleep. Oh, those are the guardrails.
Yeah.
Okay, and so slew again — once again, more safety things, really. Okay, so here's how the telescope works: you're slewing until a certain point, and then you have to do a flip, because the telescope is in a different reference frame — the reference frame of the stars. So you actually have to do a flip, and that's this meridian flip here, and that's all built in. It's built into the mount; this software just kicks it off when it's time. It's like: hey, I'm within these constraints and the target is right overhead.
I’m going to stop doing everything, pause and do the flip and then turn everything back on. It’s very nice. And then this is a filter change.
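The flip decision the mount software makes can be sketched roughly like this (a toy illustration — `needs_meridian_flip` and the grace margin are made up for the example, not the actual mount or sequencer API):

```python
def needs_meridian_flip(hour_angle_deg: float, pause_after_deg: float = 5.0) -> bool:
    """A German equatorial mount tracks a target westward until it crosses
    the meridian (hour angle 0); once it is past a small grace margin, the
    mount must swap sides so the scope doesn't collide with the pier."""
    return hour_angle_deg > pause_after_deg

print(needs_meridian_flip(-20.0))  # target still east of the meridian -> False
print(needs_meridian_flip(6.0))    # past the grace window: pause, flip, resume -> True
```

The real software also re-centers and restarts guiding after the flip, since the camera's field rotates 180 degrees.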
So this is an autofocus after filter change.
So what this means is if I’m going from oxygen, hydrogen, sulfur.
I want to double check that I'm still in focus, right? It's that precise. So this triggers once a filter's changed.
And then again, another safety constraint.
Some of these are from experience. OK, and then this is the key.
So dithering is about noise.
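A minimal sketch of what dithering means in practice (the `dither_offsets` helper and the 3-pixel scale are illustrative assumptions, not the real guider's logic): small random pointing shifts between exposures, so hot pixels and fixed-pattern noise land on different sky pixels and get rejected during stacking:

```python
import random

def dither_offsets(n_frames: int, max_pixels: float = 3.0, seed: int = 0):
    """Random pointing offsets (in pixels) applied between exposures.
    Because the sky shifts slightly under the sensor each frame, any
    defect fixed to the sensor decorrelates and averages out in the stack."""
    rng = random.Random(seed)
    return [(rng.uniform(-max_pixels, max_pixels),
             rng.uniform(-max_pixels, max_pixels)) for _ in range(n_frames)]

offsets = dither_offsets(15)
print(len(offsets))  # one small (dx, dy) nudge per frame
```

In the real setup, the sequencer asks the guider to make the nudge and waits for guiding to settle before the next exposure starts.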
This is your hydrogen oxygen sulfur.
So I basically take two-minute exposures — 15 of them for each filter — and then I flip to the next one, and this is just a loop. It's a for loop, actually, as most of you probably know. That's what it's doing. The reason I've done it this way, instead of, say, taking a bunch of hydrogen for two hours, then a bunch of oxygen for two hours, then a bunch of sulfur for two hours, is that you never know if you're going to hit a tree or if the clouds are coming. You don't know anything.
So this is a very generic setup. So I just keep flipping, even though it takes time to do like focusing and other steps and ready and flips and everything. I’m just trying my best to get that data. But this is a really sweet piece of software. This is what allows you to go to sleep. And this is the whole end process.
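The interleaved loop described above can be sketched as follows (the filter names, the 15-frame batches, and the 120-second exposures come from the talk; the function itself is an illustrative stand-in for the sequencer, not its real interface):

```python
from itertools import cycle, islice

def plan_session(filters=("Ha", "OIII", "SII"), frames_per_filter=15,
                 exposure_s=120, total_frames=135):
    """Interleave short filter batches instead of shooting one filter for
    hours, so clouds or trees cost you a little of every channel rather
    than all of one. Returns (filter, exposure_s) steps in shooting order."""
    batches = ((f, exposure_s) for f in cycle(filters)
               for _ in range(frames_per_filter))
    return list(islice(batches, total_frames))

plan = plan_session()
# The first 15 steps are Ha, the next 15 OIII, then SII, and the cycle repeats.
print(plan[0], plan[15], plan[30], plan[45])
```

Cutting the night short at any point still leaves usable data in all three channels, which is the whole reason for rotating.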
So once everything's triggered off and it knows the sun is coming up, it'll stop. This will warm the camera, slew to the home position, and then run that autofocuser. I put that in right at the last second, so that when you walk away from the whole setup, it's still in focus when you come back the next night. I can't tell you how many times I've come back and been like, I don't know why nothing's working — it has to be very precise. What does the image stacking process look like?
Is it taking multiple exposures, stacking them, and combining that into a single image?
I wonder if I have something.
I do. So this is a raw image.
And let’s just say I’m just checking.
So this is the deep sky stacker.
So I would go to those images.
And let's just say I pick a few.
I don't want to make you guys sit through a whole bunch. So these are light frames, hydrogen.
It’s detecting a lot of things in here.
But I just hit register. I’ve got a lot of presets down in here and whatnot.
But then I just hit OK.
And then it just stacks it.
So it’s pretty nice.
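Under the hood, stacking registered frames is essentially a per-pixel robust average. A minimal NumPy sketch — a simplified stand-in for what DeepSkyStacker does, using plain sigma clipping on synthetic data, not its actual algorithm:

```python
import numpy as np

def sigma_clip_stack(frames: np.ndarray, sigma: float = 3.0) -> np.ndarray:
    """Stack registered frames of shape (n, h, w): at each pixel, reject
    samples more than `sigma` standard deviations from the mean (satellite
    trails, cosmic-ray hits), then average the survivors."""
    mean = frames.mean(axis=0)
    std = frames.std(axis=0)
    mask = np.abs(frames - mean) <= sigma * std + 1e-12
    # Average only the unrejected samples at each pixel.
    return (frames * mask).sum(axis=0) / mask.sum(axis=0)

rng = np.random.default_rng(0)
frames = rng.normal(100.0, 1.0, size=(15, 4, 4))   # 15 fake registered subs
frames[3, 1, 2] = 5000.0                           # a "satellite trail" outlier
result = sigma_clip_stack(frames)
print(abs(result[1, 2] - 100.0) < 2.0)             # the outlier was rejected
```

Averaging n frames this way cuts random noise by roughly the square root of n, which is why total integration time matters so much.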
And I'll show you in a second what comes back from that — it's the raw image I started you guys out with. Let me see if I have it. But yeah, that's DeepSkyStacker. So is there ever something you want to shoot that doesn't fit in one view? Do you ever have to take multiple pieces and stitch them together? I've done that.
That’s a multi-panel.
OK.
That's what they call it in astrophotography.
Oh, OK.
There’s this option.
So let’s say M31 happens to be a really good one.
because this is a very large galaxy.
Let me see if I can show you this — I don't want you guys wondering what I'm talking about. This is a very common image that I'm pretty sure most of you are familiar with, right? And you can notice that the colors are very different.
That’s based on somebody’s palette, right?
So in this case, to answer your question, that's what you do.
Okay. And you hit those and then add the sequence.
Four different shots.
Okay.
So, but think about what that is. That’s four panels.
Three different colors.
Yeah, 15 minutes.
Yeah, 15 minutes each. It adds up. You're praying for clear skies. And this is one of those things where you have to have multiple nights. There's no other way.
And I've actually done this image, but I couldn't get the color working, so I only have it in black and white. But yeah, you would just add it to the sequence.
I'd probably have to clear what I had to show you guys, but you just add it, and then it takes care of it. It figures out the positions, the overlaps, and all of that.
And so you don’t have to take care of like trying to get that.
Right. So it knows what your overlap is before you even take the shot.
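Computing panel centers from a desired overlap is simple geometry. A hedged sketch (the function and its parameters are illustrative, and it ignores the cos(dec) compression of right ascension that real framing tools account for):

```python
def panel_centers(target_ra, target_dec, fov_w, fov_h,
                  cols=2, rows=2, overlap=0.2):
    """Centers for a cols x rows mosaic around (target_ra, target_dec),
    with each panel overlapping its neighbors by `overlap` of the field
    of view so the stitching software can blend the seams. Degrees in,
    degrees out."""
    step_x = fov_w * (1 - overlap)
    step_y = fov_h * (1 - overlap)
    x0 = target_ra - step_x * (cols - 1) / 2
    y0 = target_dec - step_y * (rows - 1) / 2
    return [(x0 + c * step_x, y0 + r * step_y)
            for r in range(rows) for c in range(cols)]

# A 2x2 mosaic with a 1-degree field and 20% overlap: four centers 0.8 deg apart.
print(panel_centers(10.68, 41.27, 1.0, 1.0))
```

Planning the overlap up front is exactly what lets the framing software, rather than the photographer, worry about where the seams fall.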
Right. I’m usually working backward where I have the shots. You have to match it.
And I have to match up. But it has to know where the edges blend over and all this.
Yes.
Your software does that.
But it’s doing the opposite of what you have here.
Correct.
Because it’s three letter agency stuff. No. Okay.
No.
We’ll work for any TLAs.
That I know of.
This is probably the best way to do it, actually.
Let me tell you.
Yeah.
Oh, man.
So how big is your NAS?
How big is your NAS system? It's a RAID 6, I think, built from four-terabyte drives, so with the parity overhead it comes out to something like six terabytes usable.
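As a sanity check on those numbers: RAID 6 dedicates two drives' worth of capacity to parity, so usable space is (n − 2) × drive size. With four 4 TB drives that's nominally 8 TB; real-world figures land lower after filesystem overhead and decimal-vs-binary terabytes, which may be where the roughly-six-terabyte figure in the talk comes from:

```python
def raid6_usable_tb(n_drives: int, drive_tb: float) -> float:
    """RAID 6 stores two drives' worth of parity, so any two drives can
    fail without data loss; usable capacity is (n - 2) * drive size."""
    assert n_drives >= 4, "RAID 6 needs at least four drives"
    return (n_drives - 2) * drive_tb

print(raid6_usable_tb(4, 4.0))   # 8.0 TB nominal usable from four 4 TB drives
```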
Okay.
I guess we need to keep backups.
Uh, yeah, we actually have two NASes.
There's one that's a backup. You spend a whole night getting one photo, so yeah, you get a little attached to that. Yes. Any big goals you're working toward that you haven't gotten yet? The planets — I really would, but that's a whole other setup, right? And I don't want to invest all my time and energy and money into that. The panels are a big thing; I recently tried a mosaic and haven't even had time to process it. The different comets. And I would love to detect an exoplanet — explore those plugins, right, and see if there's something else in there.
How hard is it to travel with this?
How much time would it take you to set this up in a new place?
I mean, like, if I said I'm gonna book an Airbnb and take it with me? Yeah. Well, the telescope is 35 pounds itself. Okay. But this is a physics problem: if the telescope is 35 pounds, then you have to have a counterweight that equals that and more.
Right.
Because you also have all the equipment that's on the telescope. Right. And everything has to be balanced, which is a whole thing I didn't touch on. So that's why I leave it in place without moving it. Now, what you've hit on is my backup slides. If I wanted a mobile setup, they make a mount that doesn't need the counterweights. And if you chose a shorter, more landscape-style scope, which is this one, then you could probably get it way more mobile — maybe the total is 35 pounds altogether, right?
Much more expensive, though — it's like $5K. But then there's also this $500 option that's built to be portable. It's all AI-based, right, with cameras and tracking. And I've wondered: why not book an Airbnb and just pack this guy? It's small enough to actually fly with as a carry-on.
Yes, exactly.
How common are these telescopes?
Like, if you walked up to someone else's telescope and it was the same kind of model, would you be able to operate it? Or is it so specific to your settings and your software? Or could you hook up — you know, if they had the same camera module or the same kind of stuff — would you be able to run it? A lot of the drivers are pretty similar.
All the open-source software is pretty similar.
The mounts are like maybe six or seven different ones.
That's roughly it; it's not like it's vast. Okay. Have you used all of them, or...?
A lot of the hardware is built here — a lot of the mounts are built here. That's really cool. The cameras — the ZWO cameras that I use — have become really popular. They're CMOS cameras, super cheap for the most part, and yeah, they're built in China. I'm just thinking you could build some kind of compound where you charge astrophotographers, right, to come in and use your telescopes that are already set up and everything. Yeah. In fact, I thought about doing that before getting into all this, right? But this is actually how I am.
If you saw me at work, this is how I am. Okay. I get into, like, how does it really fundamentally work?
I want to know how do I get this data right?
You might have mentioned this before, and this might be a naive question, but with how far away the objects are, and with our different perspective points viewing them, can they be close enough that you can share data between a couple of people? Yes, I've seen that on YouTube — I've seen people from totally different locations sharing data to get more and more, right?
Because like I told you, like 30 hours gets really good clarity. So they’re just trying to like get as much as possible. I think I’ve heard of that with radio arrays.
Yeah. To try to do larger measurements. But you’ve seen it in optical?
Yes.
Well, I mean, the Earth is moving pretty fast, so even from one night to the next you're going to be in a different position anyway. Yeah, but it doesn't matter in this sense, because you're in the reference frame of the stars. So there's no stereoscopic information from those two different points, right?
The viewpoints are just way too close together compared to something that far away.
Correct, that's the parallax. You'd need a little more separation.
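The parallax point can be made concrete. By definition, a 1 AU baseline on an object 1 parsec away gives 1 arcsecond of shift; two backyard observers sit vastly closer together than that (M31's roughly 780 kpc distance is a standard figure, used here purely for illustration):

```python
import math

AU_M = 1.495978707e11   # astronomical unit, m

def parallax_arcsec(distance_parsec: float, baseline_au: float = 1.0) -> float:
    """Small-angle parallax: the apparent shift in arcseconds of an object
    seen from two points `baseline_au` apart. By definition of the parsec,
    a 1 AU baseline on a 1 pc object gives exactly 1 arcsecond."""
    return baseline_au / distance_parsec

# Two observers a continent apart (~5000 km = 5e6 m) looking at M31,
# roughly 780,000 parsecs away:
shift = parallax_arcsec(780_000, baseline_au=5.0e6 / AU_M)
print(shift)   # ~4e-11 arcsec: no recoverable stereo information
```

Even Earth's full orbit only resolves distances to nearby stars, which is why deep-sky data from different observers can simply be stacked as if taken from one place.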
Maybe across the country. The only time that wouldn't be right is the International Space Station I talked about, right? It's close enough that you can get some sort of rotation to it. Were you able to do anything with the last solar eclipse a couple of years ago? I came home in the middle of it and it was cloudy. When I got home it was cloudy — I went out there, I had it all set up. I'm sorry. You've got a comment from Sarah: thanks for sharing, you may have created some more astrophotographers. So you have remote access too, so you can start it while you're at work? Yes, I have remote desktop, but then I also have the security cameras.
So what I usually do, is if it’s up for several nights, like before I go to work, I’ll go out and check on everything and start it.
And then it's got that timeline — that timed wait, right — set in there for me.
And then, let's say I want to go out to dinner with my husband, or go out with my family or friends — I can check in on it through the security camera and just make sure it's doing what I expected. But the remote desktop is really important, because it's cold.
Now, in the summer it cools off in the evening, so that's not so bad. But it is cold. If I have a problem, I literally will consciously think: have I fixed it to the point that I can go back inside to the computer? Is this not a cable problem, or some other problem I can't fix remotely? And then there are times when you're trying to get the telescope into the reference frame of the stars.
It’s called polar alignment.
When you’re doing that, you have to physically be here.
So if it's set up for multiple nights, you're done, right? But yeah, there are times when the alignment isn't right — when I'm checking the guiding and it's way above one — and I'm like, oh, I've gotta go back out there.
It’s cold.
What’s the power source around this all the time?
Batteries, or are you running this on an extension cord?
An extension cord, but there's actually solar and batteries at the shed. There's a shed nearby that I'm pulling power from, but there's no shore power out here. So we have a backup system, and I've literally run it on the sun.
Right. Okay.
But larger lithium batteries, of course — and I've personally had issues in the cold with batteries that stopped working because of the cold. Yeah, but mainly the extension cord running from the house is the backup. Okay.
Is that cold?
Like an Anker. Yeah. Okay.
It's fine as long as it doesn't get too cold. Yeah. And then they have temperature regulation. Well, the key thing is it protects against power outages. Yes, that's the key thing there. Yep. And then the laptop's on battery, right? Correct. You'd hate to lose a night to that.
Exactly.
No kidding, after all the setup. And the tracking mount is of course moving, keeping up with the sky. Yes. Yeah, and you don't want to interrupt that.
Agreed. And then Windows says: we're going to force a reboot. So, okay — going to reboot in 15 minutes. And you're going: no, cancel, cancel! I love NINA, but it requires Windows. A lot of the software is Windows-based — not Ubuntu, not Linux, which is my strong suit, right, and what I prefer. Yeah, and Windows will regularly update .NET without asking.
And that impacts NINA, and then it impacts the drivers for the cameras — that's why I told you there was dependency hell, because all of that has to work together. So I've gone in and basically tried to tell Windows: don't update, don't touch anything. Is it even connected to the internet out there? Generally it isn't.
I have to download the databases and everything offline, because I don't have internet out there.
That would prevent that. Mm-hmm.
Yeah.
We'd have to run a cable out there, right? Any other questions?
I’m also watching online for those.
When you keep zooming in using the AI tool, you mentioned it got weird — I was wondering if you have any examples or anecdotes. I thought about pulling one up. So you see how it's doing these tentacles?
Like that is not.
Like, I thought, why does it infer like that?
You know?
One of the weird things to think about is that these models were trained on a lot of video content.
Yes. How many videos of a star do you think they looked at?
Yeah. So, you know, this might be some of it.
Well, I'll pose this to all of you. What do you guys think — what if I fine-tune? I looked it up; I think that can be done. We have Hubble and NASA imagery, right? Here's my input; here's what it should look like. I thought that would be a fun project, right?
I'd like to see if I could get this to stop. Any video model does this if you keep it running a long time. It's like every frame: okay, this is the style, I think it should look like this. Then, as each frame drifts slightly off-style, it becomes the new reference point — okay, it should look like this — and it keeps building new reference points, so the longer you go, the further off you can get. You can start with something that looks like real life and end up with 2000s-CG kind of stuff. One thing I've seen the diffusion models do — I don't have an example of it here — is they like to make things explode.
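That compounding-reference-point effect is just error accumulation in an autoregressive process. A toy model (purely illustrative — not how any real diffusion model works internally) shows the random-walk growth:

```python
import random

def style_drift(n_frames, step_error=0.01, seed=0):
    """Toy model of frame-to-frame drift in autoregressive video
    generation: each frame conditions on the previous one, so small
    per-frame errors accumulate as a random walk instead of being
    pulled back toward the original image. Returns |deviation| per frame."""
    rng = random.Random(seed)
    drift, path = 0.0, []
    for _ in range(n_frames):
        drift += rng.gauss(0.0, step_error)  # error is relative to the last frame
        path.append(abs(drift))
    return path

def mean_drift(n_frames, trials=200, step_error=0.01):
    """Average end-of-clip deviation over many random runs."""
    return sum(style_drift(n_frames, step_error, seed=s)[-1]
               for s in range(trials)) / trials

# Longer clips wander further from the starting style on average
# (random-walk deviation grows roughly like sqrt(n_frames)).
print(mean_drift(10) < mean_drift(1000))
```

World models and longer-context conditioning are, roughly speaking, attempts to anchor generation to something more stable than the previous frame.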
I don’t know why.
That was my first thought — maybe I should try to do this. Special effects. Star Wars, that's right. Oh, okay.
I think that's what it was trained on.
If you think about the general public using this Wan 2.1 model, and they have something that looks like this, what are they going to be trying to do with it?
You know, you're probably not thinking, hey, let me look at a nebula in three dimensions and have it turn.
They’re probably 16 years old trying to figure out how to make their YouTube thing look cooler or something.
I don't know. I did one where I purposely wanted it to explode. There's actually the Bubble Nebula, and I said, make it pop.
And then I was like, make it pop and have an asteroid hit it. It did the craziest thing — I could not stop laughing. The bubble's there, the nebula bubble is there; the asteroid comes in, bounces off of it, and then it explodes. Bounces. Whoops.
Yeah, mostly that's really what it all does.
There's Looney Tunes physics going on. Oh yeah, Road Runner. A lot of these have been trained specifically to work in real-world situations — like the old example where you'd have a guy on a sidewalk, a car goes by, and the guy still needs to be on the sidewalk after the car passes. So it has to have some kind of temporal consistency. But when you're in space, things should fly through each other, right?
That's not normal to it — I mean, it's a perspective problem, right? It's not trained on that perspective.
Yeah. Is the mic working here? Yes, Josh, go ahead. I think this would be a good time to plug our series next week, which is on world models — how do we teach models to know how things work, instead of just what comes next. It's exactly what you guys are talking about.
And we even talk a little bit about Sora and Wan specifically. So if you're interested in this and want to learn more, we'll go into that. Cool. Sweet, Josh. Thank you. That is next Wednesday at six, virtually.
So keep an eye out. Any more questions? This is just mesmerizing. I really like how a lot of this stuff is open source and available. I think that's probably why the community is where it is — just because it's available to hack on and play with and do things.
I might go see if I can pull some of these, especially if the code’s there.
It’d be interesting to throw some other AI at the code bases and see what you could do to maybe fix dependency hell or do something.
So you haven’t automated your process from beginning to end, right?
Just that NINA portion that I showed you, and then leaving everything set up.
But no, I haven't built something in n8n or whatever, where you run this model and then that model. Not yet, but I think I could do that. You've got to be quite the image... Well, that's it — so let me figure out how to stop recording.

