Rosette Nebula – A gate to the immaterium? 🔭

It is astroimaging season, and we have had several clear nights here. I finally worked out how to collimate my 8 inch Newtonian well enough to use my Starizona Nexus coma corrector/reducer.

My Skywatcher 8″ Quattro together with the Nexus becomes an F3 system, which is quite a powerful little telescope. It requires very precise collimation and focus, which can be very challenging to achieve. I have struggled with it on and off for a year, and finally got it working well.

I think these were the main issues. If you face something similar they are worth checking out.

  1. Primary mirror clips were a bit too tight – Sometimes (I think this was temperature related) my stars would look oddly misshapen. Once I made a bit more room for the primary mirror, the star shapes became nice and round.
  2. Secondary collimation with a badly collimated laser – I thought my Hotech laser collimator was well collimated based on rotating it in the focuser. It turns out it is very important to rotate and then fasten it to check the laser's collimation. Mine was not badly off, but enough to make it very hard to collimate at F3. I collimated the laser by repeatedly rotating and fastening it in a two inch eyepiece holder, marking the laser position each time, then adjusting the laser towards the center of the marks, and repeating until the laser point stays stationary when rotated.
  3. Secondary collimation drifts a bit – The secondary collimation sometimes drifts a bit over time. It seems temperature related, but I am not sure why it happens. As long as I check it before imaging, it has been fine so far.

Pics or GTFO!

Once this was working I conveniently had the Rosette Nebula drifting by the balcony for a few hours two nights in a row. I imaged approximately 40 min of Ha, 80 min of OIII and 70 min of SII (using 120s, 300s and 180s exposures respectively) with my ASI1600MM camera, which resulted in these two images based on the same data.

The Rosette Nebula in SHO. Stacked using Astro Pixel Processor and stretched in Pixelmator Pro. It was not a very noisy image, but I did an ML denoise and a slight sharpen to deal with some noise at the edges. This version is slightly cropped because the rotated OIII frames did not align with the Ha and SII.
The same data with nearly no crop, using an HSS coloring.

I like the SHO a bit more, but the HSS version looks really menacing and stormy. If I were a W40K artist, this is how I would depict a gate to the immaterium.

I think this setup fits the typical seeing here in western Norway a bit better than the basic F4 setup of this telescope. As mentioned, it requires very precise collimation, so you either need to be experienced or be prepared to spend some time getting everything right.

I really hope I get some time at a nice dark site with this setup soon.

Using NetBird with my reMarkable 2

Should not have bought that drop bear repellent 🤯, better let him chew on some bits…

No drop bears or reMarkables were harmed making this post, but yours might be if you do nonsense like this. At least take a picture of the reMarkable SSH details before you start, and proceed at your own risk.

I really like my reMarkable 2, but the default setup has some shortcomings. The cloud connect thing works nicely, but I want my backups in one place where I am in control. Currently that is my Synology.

My current setup is that my devices connect to the Synology using SSH keys. I do not want my Synology exposed to the world, so my devices are in a WireGuard network with the Synology, with the connections managed by NetBird. To connect my reMarkable I would need to install a NetBird client on it, add an SSH key, and then use rsync to send files to the Synology. Should be easy…

NetBird client

The reMarkable has an ARM processor and runs a version of Linux made for the device. It turns out the CPU supports armv7, so the NetBird client cannot be the armv8 build; it has to be the armv6 one. NetBird had me covered.
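For reference, getting the client onto the device went roughly like this. The version number and asset name are placeholders (check the NetBird releases for the current armv6 build), and 10.11.99.1 is the reMarkable's USB network address:

# On a computer: fetch and unpack the armv6 build (placeholder version and file name)
curl -LO https://github.com/netbirdio/netbird/releases/download/v0.25.0/netbird_0.25.0_linux_armv6.tar.gz
tar -xzf netbird_0.25.0_linux_armv6.tar.gz

# Copy the binary to the reMarkable over USB
scp netbird root@10.11.99.1:/usr/bin/netbird

# On the reMarkable: install the service and bring the client up
netbird service install
netbird up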

TUN

Once installed it seemed to connect fine, but then it stopped working, and netbird status said something like “missing tun”.

I had experienced this before, so I tried to load the tun kernel module. Sadly there is no tun module on the reMarkable; an issue in the reMarkable Linux repository explains this.

Challenge accepted! So began an adventure in cross compiling the tun module. Checking out reMarkable Linux got me the branch rm1xx_5.4.70_v1.3.x, which I eventually figured out corresponds to the latest release and is the correct branch for this.
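If you want to follow along, the checkout looked roughly like this, assuming the published kernel sources at github.com/reMarkable/linux (treat the URL as my best recollection):

git clone https://github.com/reMarkable/linux.git remarkable-linux
cd remarkable-linux
git checkout rm1xx_5.4.70_v1.3.x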

Then I needed a cross compiling toolchain. Initially I picked the wrong one: builds would succeed, but tun would not install on the reMarkable, failing with some really weird errors. Eventually I got the correct toolchain. It can be installed on Ubuntu with:

sudo apt-get install gcc-arm-none-eabi

With the toolchain in order, the Linux build configuration was the next hurdle. At first I struggled with a full menuconfig, but a much easier way to configure Linux for the reMarkable is to start from the existing reMarkable 2 config (arch/arm/configs/zero-sugar_defconfig). I duplicated that and added CONFIG_TUN=m, roughly as sketched below.
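The duplication step, with my-zero-sugar_defconfig simply being the name I gave the copy:

cp arch/arm/configs/zero-sugar_defconfig arch/arm/configs/my-zero-sugar_defconfig
echo "CONFIG_TUN=m" >> arch/arm/configs/my-zero-sugar_defconfig

Then the configuration and module build: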

make ARCH=arm CROSS_COMPILE=arm-none-eabi- my-zero-sugar_defconfig
make ARCH=arm CROSS_COMPILE=arm-none-eabi- drivers/net/tun.ko

Got the tun.ko and smashed it into the kernel using:

cp tun.ko /lib/modules/5.4.70-v1.3.4-rm11x/kernel/drivers/net/
echo "kernel/drivers/net/tun.ko" >> /lib/modules/5.4.70-v1.3.4-rm11x/modules.dep
depmod
modprobe tun

And it worked! netbird status said connected!

Dropbear SSH

I added my existing backup key to the reMarkable, and got an “SSH error something something”. SSH was also being weird; I could not add -vvv to my SSH commands 🤯. A quick ssh -v revealed why: Dropbear SSH 🤯! I guess I should not have bought that drop bear repellent 25 years ago.

Dropbear SSH seems nice and lightweight. It really wants its own key format, so I used dropbearkey to generate a new key, like this:

dropbearkey -f ./id_dropbear -t ed25519 -s 256
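If you need the printed public key again later, dropbearkey can reprint it from the private key:

dropbearkey -y -f ./id_dropbear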

Added the printed public key to the Synology, and rsync finally works!
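A backup run then looks something like this; the key path, the Synology's NetBird IP and the target folder are placeholders for my own values, the source is the reMarkable's xochitl document directory, and on Dropbear systems ssh is typically just a link to dbclient:

rsync -av -e "ssh -i /home/root/.ssh/id_dropbear" \
  /home/root/.local/share/remarkable/xochitl/ \
  backupuser@100.87.23.45:/volume1/backups/remarkable/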

No cron on the reMarkable though; sigh; off to learn systemd timers…
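In case I (or you) actually go down that road, here is a minimal, untested sketch of what it will probably look like: one service unit that runs the rsync command above, and a timer that fires it daily. The unit names and paths are again placeholders.

# /etc/systemd/system/remarkable-backup.service
[Unit]
Description=Back up reMarkable documents to the Synology

[Service]
Type=oneshot
ExecStart=/usr/bin/rsync -av -e "ssh -i /home/root/.ssh/id_dropbear" /home/root/.local/share/remarkable/xochitl/ backupuser@100.87.23.45:/volume1/backups/remarkable/

# /etc/systemd/system/remarkable-backup.timer
[Unit]
Description=Run the reMarkable backup daily

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target

Enabling it should then just be a matter of systemctl enable --now remarkable-backup.timer.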

On the streets of Facebook

Facebook is the only social network I use for anything serious. I wish it were more like the forums from 15-20 years ago, but I digress. Lately my Facebook feed has been filling up with more and more “Suggested for you” nonsense. Previously I used the hide feature on those, and that partly worked for a while, but now it just leads to more and more fringe stuff appearing. Eventually I conjured up some energy to deal with it.

What Stable Diffusion should have come up with for the prompt: Facebook Suggested for you on a good day.

I do not trust random browser plugins, so I decided to look into writing it myself. Firefox makes that very easy. This small snippet made Facebook acceptable again.

//Stupid flag to only run removal after some time, and not on each observed update
let mut = false;

//Get rid of "Suggested for you", probably needs customization every time FB adds more divs ;) , worked on 19 Mar 2023!
function removeShit() {
	if (mut) {
		//Reset the flag, so the next sweep only happens after new updates are observed
		mut = false;
		for (const span of document.querySelectorAll("span")) {
			if (span.textContent.includes("Suggested for you")) {
				console.log("Hide all the nonsense");
				//The hardest part of all this is counting all that nesting, what is going on here.
				const pr = span.parentElement.parentElement.parentElement.parentElement.parentElement.parentElement.parentElement.parentElement;
				if (pr != null && pr.style.display != "none") {
					pr.style.display = "none";
				}
			}
		}
	} else {
		//console.log("No crap seen yet")
	}
}

//Observe the entire doc
const targetNode = document

//Observe all the crap
const config = { attributes: true, childList: true, subtree: true };

// Execute on observe: just flag that something changed
const callback = (mutationList, observer) => {
  for (const mutation of mutationList) {
    if (mutation.type === "childList" || mutation.type === "attributes") {
      mut = true;
    }
  }
};

const observer = new MutationObserver(callback);

// Start observing
observer.observe(targetNode, config);

//Interval to check for site changes and remove stuff.
const intervalID = setInterval(removeShit, 100);

As always, use at your own peril.

Facebook will still spice it all up with the ineffable timeline ordering, and still eats all the datas.

Remarkably awesome!

I love my reMarkable 2; it has worked great for all my needs, with one exception. It was hard to use for mathematics, since it was hard to expand pages while preserving the context. I have been asking for this feature in all my feedback to reMarkable, and now it is finally available!

I tried it a bit last night, and it works great. I am really looking forward to trying it more over Christmas!

Improved astrophotography capabilities

My new equipment

After buying a proper equatorial mount (an SW EQ6-R Pro) and an astrograph, imaging has become much easier. I have also finally learned how to collimate mirrors well enough for it not to be a complete nightmare. It is still pretty nightmarish in the cold and dark, but what isn’t.

Guiding (using this camera in my finder scope to track stars) has also helped imaging a lot, since it partially compensates for the polar alignment not being perfect, as well as allowing really long exposures if needed. From the city I cannot see Polaris, so I just do a very rough polar alignment and hope guiding deals with the tracking issues. I should probably learn to improve the polar alignment based on mount tracking errors, but it seems like such a tedious procedure. I have yet to find the energy to learn it.

The quality curse

One nice thing about lucky imaging DSOs is that most non-tracking-related image defects are not really a problem, since they are swamped by tracking issues. I was super happy if I got halfway decent data. Now, with proper tracking, I get really annoyed by tiny technical defects.

Anyway, here are my best images from this year so far. I had a lot of fun taking and editing these. They were all taken with only dark calibration frames, and either fake flats or no flats. AstroPixelProcessor is magic; all the DSO images were stacked and partly post-processed using it.

The images

M31 – Andromeda galaxy, as well as M110 and M32. Approximately 2 hours of 180s exposures at 1600 ISO.
M42 – The Orion nebula. I think I managed to nicely balance the very bright core and the rest of the nebula. A mix of 30s and 60s exposures at 800 ISO. In total 1 hour of exposure. Then some creative editing to keep some detail in the core.
Moon, with increased color saturation to show the different shades of gray reflected. This is 130 images stacked, from a series of 1000 images. It was stacked using PlanetarySystemStacker, which is very neat for us Mac users who otherwise do not really have any great native stacking applications for planetary images.
M45 – Pleiades. This was a hard edit since it was taken from bortle 6 skies with no filters or anything. UGC 2838 is also visible in the upper right part of the image. That is one pretty faint galaxy!

I am bitten so hard by this hobby, so expect more astrophotography-related content. I will try to post other stuff as well, but this is way too much fun.

Blinking cepheid variable stars in Messier 13

Something that feels like a once in a lifetime event happened here in Bergen this spring. We had four consecutive days of clear skies, with very little wind and no moon!

Earlier on the blog I have expressed the desire to try and image Messier 13 on consecutive nights, to see if I could detect the difference in magnitude of the variable stars in the cluster. In particular the variable star V1553 Her, which has a very convenient period of approximately 5 days; 4 days of observing would get me most of the period.

These 4 days were pretty late in spring (30th of March to 2nd of April), and observing had to be done after 10 PM in a work week. Not ideal. Thankfully, apart from the usual tracking problems, there were few problems with the equipment. My own patience failed while focusing though, which resulted in two nights with good images, and two with not so great quality.

This was a bit problematic since I wanted to compare images over time: with 40 good images from one night and only 10 from another, there was a big difference in the brightness of the stars once I had stacked each night. To account for this I did some brightness matching in post-processing, using the non-variable stars of each frame as references.

The resulting 4 frame time-lapse looks like this.

I was pretty happy with this, as it shows the variability of not just one, but at least two, and maybe even a third variable star I missed at first (can you find it? This paper has charts to help). The change in brightness, especially for V1553 Her, was also much clearer than I expected. Great success!

The process of finding the distance to the star from the magnitude and variability data is neat. Since V1553 Her is a Type II Cepheid (how this was determined is not clear to me, please add a comment if you know) whose period is approximately 5 days, the star has an absolute magnitude of approximately -1.5 according to this chart. The formula below (from here) should then give the distance \(d\).

$$ M_v = m - 2.5\log\left(\left(\frac{d}{10}\right)^2\right) $$

Looking at the images and the reference stars, a rough estimate would be that V1553 Her varies between 12 and 13 in apparent magnitude. Plugging in -1.5 for \(M_v\) and 12.5 for \(m\) gives a distance of 6310 parsecs, which is only about 500 parsecs off the accepted distance to M13.
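For reference, rearranging the formula above for \(d\) and plugging in those rough numbers gives:

$$ d = 10 \cdot 10^{\frac{m - M_v}{5}} = 10 \cdot 10^{\frac{12.5 - (-1.5)}{5}} = 10 \cdot 10^{2.8} \approx 6310 \text{ parsecs} $$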

While I have not followed a very thorough process here, it is still nice to verify that my data seems to fit reality. To get better data for deep sky objects, I am considering getting a solid equatorial mount with either a small refractor or an 8 inch Newtonian for imaging, and keeping my dobsonian for visual and planetary. Once I do, I hope I can revisit this project and get a much higher quality time-lapse, maybe one that also captures the really fast variables with periods down to 0.2 days.



Using JetBrains products in a VM and on the host at the same time

I have been hacking on a libcamera experiment, and this time I had the misfortune of having to use a JetBrains product in a local VM and on the host system at the same time.

Since it is practical, I was using UTM shared networking, but that had the unfortunate side effect that CLion in the VM and IntelliJ on the host detected each other and refused to both be open at the same time. Incredibly annoying…

Some googling revealed that if you use the same username in the VM and on the host, it will all work out. I got more and more confused as to why this feature exists, but a glimmer of hope emerged. I was not ready to change the username in the VM or on the host though, since I had spent some time setting up that user.

Thankfully, this detection feature uses the JVM system property user.name to look up the user, so adding -Duser.name=&lt;host username&gt; to the clion64.vmoptions file made it all work out.
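So the whole fix boils down to a single line in clion64.vmoptions, with the value being a placeholder for whatever your host account is called:

-Duser.name=myhostusername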

Major annoyance averted!

I ❤ globular clusters

I love observing globular clusters visually. There is something very satisfying about turning the focuser to try to resolve the maximum amount of stars.

On the 12th of March I did an observing session in moonlight and quite a bit of wind, and I finally learned why astrophotographers dread wind. The telescope kept shaking, and I had to park my car in front of it to get it to be usable at all.

That had the sad effect of blocking most of the sky I was interested in, but M3 and M13 were still visible. After doing some visual observing I did some very lucky imaging in the wind. Of about 350 exposures of M13, about 70 were decent, and 10 of those were good. Stacking those 10 gave me this image.

M13 (Hercules Globular Cluster), imaged using my 10″ Skywatcher dobsonian, a Televue Powermate 2x, and my Canon EOS RP. This is 10 exposures of 8 seconds at 12800 ISO, stacked using APP.

The star density in such a cluster can be up to 1000 stars per cubic parsec. Imagine living there…

Astro observation log 27.12.2021

Finally clear skies, and I was at a Bortle 2 location. I also had my new Canon EOS RP camera.

M 42 – The Orion Nebula again

M 42 at a Bortle 2 location was great! Last time I imaged the Orion Nebula I struggled with tracking. This time I had fine tracking (for my dobsonian anyway) and I finally got pretty round stars. I got 20 good 15 second exposures at 1600 ISO. After stacking in AstroPixelProcessor and post-processing in Pixelmator Pro I got this result:

Around 15 light frames, and 20 dark frames stacked.

NGC 2024 – Flame Nebula

I have tried to observe the Flame Nebula visually many times, and I have failed every time. I was therefore only planning to take some test photos of the region. I had trouble lining up the finder scope and camera, so I did some visual observing in the meantime. The nebula showed up surprisingly clearly. Bortle 2 skies are really something.

Once back inside I regretted not observing more carefully visually, as well as not taking several image sequences. The Horsehead Nebula showed up quite clearly in my images, and with the Bortle 2 skies, maybe it would have been visible visually as well.

5 light frames stacked, showing the Flame Nebula (upper left), the Horsehead Nebula (lower right) and NGC 2023 in the lower middle.

M 33 – Triangulum galaxy

The galaxy showed up very clearly, and I got some images. Sadly the tracking worked badly in that region of the sky, and my focus was off. I am looking forward to trying this again some other time.

Single M33 shot

M 1 – Crab Nebula

I also got some images of the Crab Nebula. My focus work was not great, and I should use my Powermate next time to get more nebula data.

NGC 281 – Pacman Nebula

I have tried and never even gotten close to observing this visually before. This night it was visible, and I got a few images that were fine. I do not have enough data to stack, but my single OK light frame looks like this:

NGC 869/NGC 884 – Perseus double cluster

This cluster is easy to find, and with little battery left and tracking that was not working, I decided to do 2 second shots. Stacking those resulted in this, which is by far my best open cluster image to date.

NGC 869 (left) and NGC 884 (right)

Closing thoughts

All in all this was a very rewarding session, and I finally got some deep sky data that was worth processing. The Orion Nebula image turned out way above my expectations.

I am thinking about getting an equatorial mount, and maybe a guide camera, so I can take longer exposures. It was also really great to finally see some of the more difficult nebulae visually.

Here’s to hoping I get a night in 2022 that beats this one.

Adventures in creating a Snap from an LWJGL2 based game

The old Artifact is still kicking; not too long ago I created a Snap for it. Creating the Snap felt a bit like concocting a magical snapcraft.yaml and hoping it works out. For the first few attempts the magic never works out, and the process of figuring things out tends to be tedious at best. This was no exception. Here are some of the problems I ran into; I hope it helps someone else.

Xprop missing

My first Snap attempt immediately spit out this during startup:

java.io.IOException: Cannot run program "/usr/bin/xprop": error=2, No such file or directory

This was fixed by adding these lines to my snapcraft.yaml. The layout is needed since xprop is referred to by an absolute path. See this thread for more information.

layout:
  /usr/bin/xprop:
    bind-file: $SNAP/usr/bin/xprop

parts:
  mypart:
    stage-packages:
      - x11-utils

An xrandr puzzle

Next in line was this:

Exception in thread "main" java.lang.ExceptionInInitializerError
	at org.newdawn.slick.AppGameContainer$1.run(AppGameContainer.java:39)
	at java.security.AccessController.doPrivileged(Native Method)
	at org.newdawn.slick.AppGameContainer.<clinit>(AppGameContainer.java:36)
	at game.Artifact.main(Unknown Source)
Caused by: java.lang.ArrayIndexOutOfBoundsException: 0
	at org.lwjgl.opengl.LinuxDisplay.getAvailableDisplayModes(LinuxDisplay.java:951)
	at org.lwjgl.opengl.LinuxDisplay.init(LinuxDisplay.java:738)
	at org.lwjgl.opengl.Display.<clinit>(Display.java:138)
	... 4 more

This was caused by xrandr, which LWJGL2 uses to find display information, not being installed in the snap. It was fixed by adding the x11-xserver-utils package, which provides xrandr.

stage-packages:
  - x11-xserver-utils
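For completeness, the X11-related bits of my snapcraft.yaml ended up looking roughly like this, with mypart standing in for the real part name as above:

layout:
  /usr/bin/xprop:
    bind-file: $SNAP/usr/bin/xprop

parts:
  mypart:
    stage-packages:
      - x11-utils
      - x11-xserver-utils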

For debugging snaps and finding packages these commands were great.

#This allows you to look at the system as seen from the snap
snap run --shell <your-snapname> 

#This will give the package that installed an executable, in this case xrandr
dpkg-query -S /bin/xrandr

I hope this helps someone else looking to get their application in a Snap.