## long exposure – Issues with dark frame subtraction: Dark frames adding “noise” and changing image color/tint

While editing some landscape shots with stars, I tried to use dark frames to reduce the noise.
More precisely, my approach was to take a series of shots, then first subtract dark frames from each shot, second use the mean of the series for the foreground to further reduce noise, and third use an astro stacking tool (Sequator) to stack the sky.

Instead of reducing noise, the darkframe subtraction:

1. increased the noise, or rather added some dark/monochrome noise.
2. changed the white balance / tinted the image.
(see below)
I do not understand why this is happening or what I am doing wrong.

Procedure/Employed Troubleshooting:

• All photos were shot in succession with the same settings (15 s, ISO 6400, in-camera dark frame subtraction disabled).

• All photos were shot with the same white balance.

• While shooting the darkframes, both the lens cap and the viewfinder cover were applied.

• Photos were imported from my Pentax K1ii, converted to DNG in LR, and exported to PS without any editing/import presets applied.

• I used PS, placed the darkframe layer(s) above my picture, and used the subtract blending mode.

• I followed basic instructions on dark frame subtraction in Photoshop found here and in various videos. Note that basically all of those cover dark frame subtraction with one frame (or use tools other than Photoshop). I have tried using both one and 3 frames. The results are similar, albeit more pronounced with 3.

• I used the free tool Sequator to subtract dark frames instead (and to align stars). Adding the dark frames here made absolutely no difference.

• (This is an edit/composite done with the frames I tried to subtract darkframes of)

• A crop of the first picture, with (3) dark frames subtracted:

• A crop of the second picture, without dark frames subtracted:
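For reference, the calibration pipeline that dedicated stacking tools apply works on linear sensor data: average the dark frames into a master dark, then subtract that from each light frame before any gamma encoding. A minimal numpy sketch with synthetic frames (all array values here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear sensor data: 3 light frames and 3 dark frames.
# Real data would come from the linear RAW files, not gamma-encoded exports.
signal = rng.integers(500, 2000, size=(3, 4, 4)).astype(np.float64)
fixed_pattern = rng.integers(0, 200, size=(4, 4)).astype(np.float64)  # hot-pixel pattern
lights = signal + fixed_pattern
darks = np.stack([fixed_pattern + rng.normal(0, 5, size=(4, 4)) for _ in range(3)])

# 1. Average the dark frames into one low-noise "master dark".
master_dark = darks.mean(axis=0)

# 2. Subtract the master dark from each light frame, in linear space,
#    clipping at zero so read noise does not go negative.
calibrated = np.clip(lights - master_dark, 0, None)

# 3. Average the calibrated frames (e.g. for the static foreground).
foreground = calibrated.mean(axis=0)
```

Photoshop's Subtract blend mode instead operates on the already gamma-encoded layer values, which is one reason results there can differ from a linear-space subtraction.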

## Photo noise level in RAW format

Why is the noise level of RAW photos higher than that of JPEG photos?
I have encountered this issue several times.

## Good Low Noise Long Exposure Astrophotography camera?

Recommendations:

I was reading that Canons were the best at low-noise long-exposure photographs. Then I found this site: https://www.brendandaveyphotography.com/more/long-exposure-sensor-testing/

It seems to say that some Fuji cameras are better at low noise (half the noise) than Canon for 5-minute exposures. Which is better, Canon or Fuji?

Comparing the Canon EOS 60D vs. the Fujifilm X-E1 on a second site, https://www.photonstophotos.net/Charts/RN_ADU.htm, seems to suggest that the Fuji has half the noise. Why don't I see recommendations for Fuji cameras as used for astrophotography?

## noise – Very noisy image, looks like stars

Yesterday I did some long exposure shots, and it turned out to go quite well (in my opinion); the one issue is that the image is noisier than I have ever seen before. The noise looks like stars, and in the sky it looks okay. However, the star-like noise on the ground is ugly.

Any tips for fixing this? Shot at ISO 1600 (the max on my camera).
The image size was decreased to fit into Stack Exchange; hopefully the details are still visible.

## post processing – How to remove this noise (color patches)?

I have images with visible color noise in the form of patches:

One can reveal the patches with modular multiplication:

I tried to remove the noise with the state-of-the-art denoiser BM3D (block-matching and 3D filtering: collaborative hard-thresholding, with a second step that is a regularized Wiener inversion using collaborative Wiener filtering, as described in "Image restoration by sparse 3D transform-domain collaborative filtering", Dabov et al., 2009):

The BM3D filtered with the modular reveal:

I lost a significant amount of details (shoe, pant folds, grass).

Can someone recommend a denoiser algorithm that would do a better job correcting this noise while preserving the details?
The image is from a raw file, developed into a 10-bit TIFF, and never went through any JPEG compression, even though it looks like it did.

## probability distributions – Limit of a linear discrete-time stochastic process with uniform noise

I have posted this in the math and stats sites, but I am not sure where the proper forum for this question is. If it is not here, please go on and delete it.

Suppose we have a stochastic linear process:

$$x_{k+1} = Ax_{k} + Bw_{k} \qquad \text{with} \qquad x_{0} = a \in \mathbb{R}^{n}$$

Here, $$A,B$$ are known matrices and $$a$$ is a known vector. Moreover, the eigenvalues of $$A \in \mathbb{R}^{n\times n}$$ are in the open unit disk. Finally, the elements of the sequence $$\{w_{k}\}_{k\in\mathbb{N}}$$ are iid random variables uniformly distributed on $$(-1,1)^{m}$$.

1. What would be the distribution of $$x_{k}$$ as $$k\to\infty$$? Notice that the linear recursion above can indeed be arranged as:

$$x_{N} = A^{N}a + \sum^{N-1}_{k=0} A^{k}Bw_{N-k-1} \qquad \text{for each}\quad N\in\mathbb{N},$$ where the term $$A^{N}a$$ vanishes as $$N\to\infty$$ because the eigenvalues of $$A$$ lie in the open unit disk.

Thus, for $$x_{N}$$ with $$N\to\infty$$ we are looking at some form of the central limit theorem, where the distribution of $$x_{\infty}$$ seems to be bell shaped but cannot possibly be normal. This is, of course, because the distribution has compact support, due to the fact that each $$w_{k}$$ is compactly supported and the eigenvalues of $$A$$ are in the open unit disk.

As a plus I would like to know if the family of distributions generated by the sum above, for each $$N$$, has some sort of name (the first is a multidimensional trapezoidal distribution, the second is piecewise quadratic, then piecewise cubic,…).

2. As an example, let's take a scalar system with $$A,B = 1/2$$ and $$x_0 = 0$$. For $$k=0$$, $$x_0$$ has a Dirac distribution. For $$k=1$$, we have a uniform distribution supported on $$(-\frac{1}{2},\frac{1}{2})$$. For $$k=2$$ we would get some trapezoidal distribution supported on $$(-\frac{3}{4},\frac{3}{4})$$. For each $$k$$, the distribution of $$x_{k}$$ is some compactly supported piecewise polynomial function which indeed looks increasingly bell-shaped. Even for this case, I can't really figure out what the limit distribution would be.
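A quick Monte Carlo check of this scalar example (a sketch, not a proof) illustrates the compact support: every sample of $$x_k$$ stays strictly inside $$(-1,1)$$, so the bell-shaped limit cannot be Gaussian.

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo sketch of the scalar example: x_{k+1} = x_k/2 + w_k/2, x_0 = 0,
# with w_k iid uniform on (-1, 1).
A, B = 0.5, 0.5
n_paths, n_steps = 100_000, 60
x = np.zeros(n_paths)
for _ in range(n_steps):
    x = A * x + B * rng.uniform(-1.0, 1.0, size=n_paths)

# Support bound: |x| <= sum_{j>=0} (1/2)^{j+1} = 1, strictly, for finite k.
# Theoretical limit variance: sum_{j>=0} (1/4)^{j+1} * Var(w) = (1/3)(1/3) = 1/9.
```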

## iPhone XR black screen and hissing noise

Today my iPhone's screen randomly went black while I was talking to someone on Skype. Nothing would work; I tried restarting, plugging/unplugging the charging cable, etc. It was also making a hissing noise while the screen was black.

Everything went away after I used the button combination to take a screenshot (volume up button + power button): the hissing noise stopped and my screen started working again.

Does anyone know what this was?

## lens – Origin of the sound / noise made by some stabilized lenses?

The sound may be produced by some form of a permanent feedback loop that allows a movable lens element to "float" in the air, but does anyone know what, precisely, the origin of the sound is? And why is it permanent on Fuji lenses with OIS?

Without power, the OIS element in Fuji lenses (and some others) is free floating. The OIS element isn’t “parked” and locked when the power is removed (unlike, say, read/write heads for spinning hard drives). Thus, the OIS element has to be held “rigidly” in its optically neutral position (non-stabilizing control) at all times when the camera is powered.

As with any system under active feedback control, the controller is still working even when it is commanded simply to hold a set point. By analogy, a Harrier VTOL jet asked to just hover 100 ft above the ground is still working its control systems. The controller is always trying to compensate for error inputs, such as gusts of wind that might move the jet laterally or change its altitude.

Functionally, the only difference between enabling and disabling OIS in these types of lenses is whether or not the feedback control system is trying to filter out (i.e., compensate for) motion frequencies around 20 Hz and lower. ST Microelectronics has an interesting white paper discussing OIS controller design and implementation, that covers this in more depth.

This controller has to be almost immediately responsive to movement, so the feedback controller is operating at much higher frequencies than just the 20 Hz (and below) that needs to be filtered. It’s this controlling frequency moving the OIS element that is the source of the noise. In essence, the OIS element is acting like a speaker cone — a thin, more-or-less planar surface moving to some degree back and forth (although it’s mostly just laterally) pushes air around. Because it’s controlled by electromagnet motors, the moving OIS element is also pushing back on the lens body itself. Some of the motion is in the audible range of human hearing, which is the buzzing sound you hear.

## unity – How to add caves to a terrain mesh generated with 2D noise

Am I right that if I make a single mesh representing terrain using Unity's built-in 2D Perlin noise function, I'll then be able to somehow add caves using another technique (Perlin worms, for example)?

Or do I need a solution based on 3D noise?

## unity – How to randomly generate biome with perlin noise?

There are three main steps here:

1. Use some method to assign biomes to regions (this is the hard part, with multiple strategies I’ll break down shortly)

2. For each point in your mesh or tile/node in your world, determine which biome it’s in, as well as which neighbouring biomes it’s close to. Compute an interpolation weight representing the influence of each nearby biome.

3. Evaluate your terrain generation logic for each of the nearby biomes separately. (In practice, this means you’ve computed 3-4 possible heights or other characteristics for this point/node) The final height for this location is a weighted average of the results from the nearby biomes, according to their influence.

It might seem wasteful to compute multiple full biome results for a single point, but this is a necessary evil if you want sensible blending between them. If you try to interpolate the generation input parameters alone (particularly the noise frequency or number of octaves) and generate just one height result based on the blended parameters, you'll get nonsensical-looking blends with nasty artifacts, especially far from the origin. You also limit how complex and varied your terrain generators can be, since they all have to use the same rules.
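The three steps above boil down to blending the *outputs* of full per-biome generators, not their inputs. A minimal sketch, where the biome height functions and weights are made-up placeholders:

```python
# Step 3 in miniature: evaluate each nearby biome's full height function
# at the point, then blend the results by influence weight.

def desert_height(x, z):
    return 2.0 + 0.1 * x          # gentle dunes (placeholder)

def mountain_height(x, z):
    return 40.0 + 5.0 * (x + z)   # steep terrain (placeholder)

def blended_height(x, z, weighted_biomes):
    # weighted_biomes: list of (height_fn, weight) pairs, weights summing to 1.
    return sum(fn(x, z) * w for fn, w in weighted_biomes)

# A point mostly inside the desert, near the mountain border.
h = blended_height(1.0, 2.0, [(desert_height, 0.75), (mountain_height, 0.25)])
```

Each generator stays free to use its own noise settings, because only the final heights are averaged.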

There are lots of different strategies for step 1. I’ll classify the main families as “zoned” and “emergent”.

For the zoned strategy, we start by dividing our world into a collection of zones.

A popular method for this is using a Voronoi diagram (also sometimes referred to as a Worley noise basis in procedural generation contexts), where we pseudorandomly scatter points across our map. The set of all locations closer to this point than any other point forms one zone, which is guaranteed to be a convex polygon. This gives us a nice structured region to work with that’s still a little more organic than strict rectangles:
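A brute-force version of this Voronoi zone lookup is only a few lines (a real implementation might use a KD-tree for speed; the seed count and map extent here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

# Pseudorandomly scatter zone seed points across a 100 x 100 map.
seeds = rng.uniform(0, 100, size=(16, 2))

def zone_of(point):
    # The zone of a location is the index of its nearest seed point;
    # the set of all locations sharing that nearest seed is one Voronoi cell.
    d2 = ((seeds - np.asarray(point)) ** 2).sum(axis=1)
    return int(np.argmin(d2))
```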

(See also this Red Blob Games article for some discussion of how we can tweak the standard Voronoi approach to something we might prefer for world generation)

You can get arbitrarily more complex with your region selection, of course. Maybe applying domain warping to make the borders between regions more organic, etc. Though I recommend starting simple – most of the hard geometric borders will be hidden by the time we do our blending/interpolation, so they might not be a problem.

Next, we assign a biome to each zone.

We could do this randomly, using the zone location/ID to seed a weighted random lookup into a table of biomes. The risk with this is that adjacent biomes make their determinations separately, and so you can get combinations of adjacent zones that don’t follow any coherent geographical logic – like a lush jungle completely surrounded by desert, with no fresh water nearby or flowing through it.

We could also use rule-based zone assignments. Maybe we randomly assign a certain fraction of the zones to be water: lakes, seas, or oceans. Then we designate that any non-water zone adjacent to a water zone must be a beach, cliff, or marsh. Forests and jungles can appear 2 zones away from the nearest water, and mountains only 3+ zones away. You could accomplish this with cellular automata or other adjacency-based rules.
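One way to sketch such a distance-from-water rule is a breadth-first search over the zone adjacency graph (the graph and biome table below are illustrative placeholders, not a prescribed design):

```python
from collections import deque

# A tiny chain of 5 zones; zone 0 is water.
adjacency = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
water_zones = {0}

def water_distance(adjacency, water_zones):
    # BFS outward from all water zones at once, counting hops.
    dist = {z: 0 for z in water_zones}
    queue = deque(water_zones)
    while queue:
        z = queue.popleft()
        for nb in adjacency[z]:
            if nb not in dist:
                dist[nb] = dist[z] + 1
                queue.append(nb)
    return dist

def biome_for(distance):
    # Rule table: adjacent to water -> beach, 2 away -> forest, 3+ -> mountain.
    if distance == 0:
        return "water"
    if distance == 1:
        return "beach"
    if distance == 2:
        return "forest"
    return "mountain"

biomes = {z: biome_for(d) for z, d in water_distance(adjacency, water_zones).items()}
```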

Then for step 2, we can compute where our point-to-be-generated sits relative to its nearest neighbouring zones, and use that to compute the interpolation weights to use for each biome’s result. One way we can do that is to compute a Delaunay triangulation of the seed points we used to create our zones. This is the dual of the Voronoi diagram, where each triangle represents a junction between three adjacent zones. We can then use the barycentric coordinates of our point-to-be-generated within this triangle to compute the weight to give to each of the three nearest neighbouring zones.
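The barycentric weights for a point inside one Delaunay triangle can be computed directly from the three zone seed positions (2D points as tuples; the triangle below is an arbitrary example):

```python
# Barycentric coordinates of p inside triangle (a, b, c): solve
# p = u*a + v*b + w*c with u + v + w = 1. The three coordinates are
# the blend weights for the three adjacent zones' biomes.

def barycentric_weights(p, a, b, c):
    det = (b[1] - c[1]) * (a[0] - c[0]) + (c[0] - b[0]) * (a[1] - c[1])
    u = ((b[1] - c[1]) * (p[0] - c[0]) + (c[0] - b[0]) * (p[1] - c[1])) / det
    v = ((c[1] - a[1]) * (p[0] - c[0]) + (a[0] - c[0]) * (p[1] - c[1])) / det
    return u, v, 1.0 - u - v

# The centroid of the triangle gets equal influence from all three zones.
w = barycentric_weights((1.0, 1.0), (0.0, 0.0), (3.0, 0.0), (0.0, 3.0))
```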

(Diagram showing the relationship of Voronoi cells and Delaunay triangles from here)

For the emergent strategy, we try to model/approximate some of the real-world processes that give rise to different biomes, as a way to get more natural/logical relationships between the different structures. The biomes then "emerge" as a consequence of the simulated natural processes that shape your terrain.

The first step is to generate the underlying drivers of biome formation in our model, like temperature and moisture, to pick a popular pair of inputs. We could generate a moisture map and a temperature map as two separate, low-frequency Perlin noise maps, for example, though going this route they won't have any particular relationship to landforms or location on your map/planet.

Another route is to generate your broad-scale elevation up-front – say the first couple octaves of your terrain height function. This isn’t biome-level detail yet, more like the underlying bedrock under the biomes. Some folks will even generate tectonic plates and simulate their motion to find where there should be mountain ranges formed by two plates colliding, etc.

Once you have this coarse-grained elevation detail (and correspondingly, the locations of oceans/seas based on which areas are below sea level), you can generate other metrics like temperature and moisture from it. Say, the temperature gets colder as you move higher in elevation or away from the equator in latitude. You can model moisture based on proximity to an ocean or sea, or on the prevailing winds: if winds at a particular latitude run mostly west-to-east, then areas on the windward (west) side of a mountain range will tend to be wetter and areas on the leeward (east) side drier, as the mountains wring the rain out of the air as it passes.

Now you have, for each site in your map, a collection of input values like latitude, elevation, moisture, and temperature. You can use these as coordinates to look up into a biome assignment map.
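Such a lookup can be as simple as a few thresholds on the driver values. The thresholds below are invented for illustration, not taken from any real biome chart:

```python
# Hypothetical biome assignment map keyed on (temperature, moisture),
# standing in for a chart like the Whittaker diagram.

def biome_lookup(temperature_c, moisture):
    # moisture is a unitless value in [0, 1]; temperature in degrees C.
    if temperature_c < 0:
        return "tundra"
    if moisture < 0.2:
        return "desert"
    if temperature_c > 24 and moisture > 0.6:
        return "rainforest"
    return "grassland"
```

Extra drivers like elevation or latitude just become additional coordinates in the lookup.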

Here’s an example diagram from Navarras on Wikipedia, showing how real-world biomes relate to precipitation (moisture) and temperature:

A game might use a more abstracted version, like this example from an older version of Minecraft, via the Minecraft wiki:

You could even add more inputs like the elevation or latitude into the mix, like this plot of the Holdridge Life Zone Classification Scheme from Wikipedia:

With our computed biome driver values for a given point, we can find which region of this plot we fall into, to determine the predominant biome for this location.

For step 2, we can also determine how close we are to other biomes in this map, but this time instead of computing our distance in barycentric coordinates within our triangle of zones, or world-space distances across our terrain, we’re computing our distance in moisture-temperature space (or whatever driving inputs you’ve chosen for biome selection).
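One simple sketch of driver-space weighting uses inverse distance to a representative point for each biome; the biome centers below are invented values, and in practice you would normalize each axis so temperature and moisture contribute comparably:

```python
import math

# Representative (temperature C, moisture) point per biome -- illustrative only.
biome_centers = {
    "desert":     (30.0, 0.1),
    "grassland":  (18.0, 0.5),
    "rainforest": (27.0, 0.9),
}

def biome_weights(temperature, moisture, eps=1e-6):
    # Influence of each biome falls off with distance in (temperature,
    # moisture) space; eps avoids division by zero at a biome's center.
    inv = {
        name: 1.0 / (math.hypot(temperature - t, moisture - m) + eps)
        for name, (t, m) in biome_centers.items()
    }
    total = sum(inv.values())
    return {name: w / total for name, w in inv.items()}

# A hot, dry point should be dominated by the desert biome.
w = biome_weights(29.0, 0.2)
```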

This route ostensibly gives a stronger connection between the landforms of your continents and the details of your biomes, but it comes at the cost of less direct control over the placement of those biomes. With the wrong parameters, you could easily end up with a whole world of deserts, or a world with only scarce islands of a particular biome. And maybe that’s desirable for the variety of worlds from your procedural generator. Or maybe it’s not, and you’d prefer to sacrifice some geographical logic (that maybe only the hardcore Earth sciences geeks will notice or appreciate anyway) in order to have more hands-on control over the gameplay combinations of biomes the players will encounter, as you can get with zoning-style approaches.

Whichever route we’ve taken, we now have a list of nearby biomes influencing this point, and a relative weight to apply to each one. Now we can run our terrain generation logic for each biome at this point, and blend the results using those interpolation weights, to arrive at one consensus elevation (and other properties) for this location.

For the emergent scheme, note that you might already have a coarse-grained elevation selected, so your individual biome generators can be more flat, capturing only local terrain shapes, and delegating the broad-scale landforms to the bedrock pass you did earlier. This makes it easier to get a clean blend between adjacent biomes that won’t have a sudden artificial-looking ramp between them because they disagree about the average elevation in their domain.

You can even use the coarse elevation value as a bias for your blending function, allowing e.g. forest biomes to creep further into the valleys while letting rocky biomes dominate the higher elevations, adding a little non-linearity so the blend feels more organic.