lightroom – Family Photos System (Digital Asset Management/Organization/Storage Options?)

In terms of backup, I don’t think it’s sound. But it’s better than nothing.

Putting pictures in the cloud can add flexibility to a workflow. It’s a wonderful technology. And for business use, it is probably good enough as a backup, because business records have finite retention periods and businesses can buy insurance against operational interruptions.

But your pictures are not fungible. Money can’t replace them. They don’t have a short retention period. And pictures in the cloud are not under your control.

Miss a payment, they are gone.

Catastrophic data center event, they are gone.

Change in the host’s business model, gone.

Your account compromised, gone.

Plain vanilla, ordinary operator error by you, gone. And operator error is more likely with the cloud, because you will be touching the storage all the time for ordinary operations, not just for backup.

A good backup strategy is the opposite: the backup is offline, read-only, and redundant. Tactically, backup is based on reducing failure modes and creating multiple paths to recovery.

The cloud can be a convenient skirmish line. But it’s no substitute for a hard disk in a safe deposit box, another with a family member in another town, and so on.
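
If you do rotate hard disks through a safe deposit box, it also helps to be able to prove each disk is still intact when it comes out of storage. Below is a minimal sketch of a checksum manifest in Python; the paths are hypothetical, and on Linux the stock sha256sum -c does the same job.

    import hashlib
    from pathlib import Path

    # Hypothetical archive disk -- adjust to your own layout.
    ARCHIVE = Path("/Volumes/PhotoArchive")
    MANIFEST = ARCHIVE / "manifest.sha256"

    def sha256(path: Path) -> str:
        """Hash a file in chunks so large raw/video files don't exhaust memory."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def write_manifest() -> None:
        """Record a checksum for every file before the disk goes offline."""
        with MANIFEST.open("w") as out:
            for p in sorted(ARCHIVE.rglob("*")):
                if p.is_file() and p != MANIFEST:
                    out.write(f"{sha256(p)}  {p.relative_to(ARCHIVE)}\n")

    def verify_manifest() -> int:
        """Re-hash everything on the next rotation and report silent corruption."""
        bad = 0
        for line in MANIFEST.read_text().splitlines():
            digest, name = line.split("  ", 1)
            if sha256(ARCHIVE / name) != digest:
                print(f"MISMATCH: {name}")
                bad += 1
        return bad

    if __name__ == "__main__":
        write_manifest()    # run once, when the disk is filled
        # verify_manifest() # run whenever the disk comes back out of storage

Run write_manifest() once before the disk goes into the box, and verify_manifest() each time it comes back; a clean verification is one of those multiple paths to recovery that you can actually test.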

icloud – Photos app on Mac won’t sync on macOS Big Sur

I’ve searched everywhere online and cannot seem to find an answer to my issue with the Photos app on Mac. Currently the Photos app is blank with no contents, no synced photos, nothing… every page and tab is blank.

On my Mac I’m signed into the same Apple ID that I use on my iPhone, and I have about 2GB of photos.

This is what I see in the Photos app on my Mac:

[screenshot: iCloud tab of the Photos app]

Online instructions say that to turn on iCloud Photos, you go to “System Preferences” -> “Apple ID”.

But when I go there, it’s already turned on:

[screenshot: Apple ID pane in System Preferences]

I’ve tried the option to repair the Photos library by launching the app while holding down Command + Option. It hasn’t worked.

My setup:

  • macOS Big Sur
  • iPhone 8, iOS 14.4
  • Both signed into the same Apple ID with iCloud Photos turned on.

I feel like an insane person… can someone help me out?

astrophotography – How can I achieve more clarity in my photos of the moon?

Some possible reasons, arranged in the likely order of influence, for the lack of clarity in the example photo:

1) The optical limits of your lens. The EF 100-300mm f/4.5-5.6 was released as a budget telephoto zoom lens in 1990 at the dawn of the EOS era. Compared to the current EF-S 55-250mm f/4-5.6 STM, at the longest focal lengths and widest apertures there’s a significant difference in sharpness.

35mm film is much less demanding of a lens in terms of resolution than modern digital sensors such as the one in your 20MP 70D. From an answer to a question about the difference between “digital lenses” and “film lenses”¹:

Although not universally the case, most lenses designed and introduced during the digital age are better than their older film era counterparts, especially in the consumer and mid grade sectors. Manufacturers of the top tier lenses have also been forced to introduce newer versions of old classics. The new consumer lenses may not be as good as the old “L” glass (but sometimes they get close), but they are much better than yesterday’s consumer lenses – especially zoom lenses, which have benefited tremendously from computer aided design and modeling. What used to take weeks or even months to test by making a physical prototype can now be accomplished in a few hours using supercomputer simulation.

Users of digital cameras tend to expect more out of their lenses, primarily due to two factors:

  • Digital sensors are perfectly flat. Film isn’t. Some of the most expensive film cameras actually had mechanisms that created a vacuum behind the film to help it lie as flat as possible while being exposed. Even then, with color film the emulsion layer for each color was at a slightly different depth. So if focus was perfect for one color, it would be slightly off for the other two!
  • Pixel peeping has raised expectations to a ridiculous level. Take a 20MP image and display it at 100% (1 pixel per screen pixel) on an ≈23 inch HD (1920×1080) monitor and the magnification is equivalent to printing at 56×37 inches! No one expected a 35mm consumer grade lens to be perfect at 56×37! But a lot of folks now seem to. (The magnification arithmetic is sketched just after this list.)
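
To put a number on that, here is the pixel-peeping arithmetic as a quick Python sketch. The sensor resolution is the 70D’s (5472×3648); the 23-inch 1080p monitor is the assumption from the bullet above.

    import math

    img_w, img_h = 5472, 3648        # 20MP Canon 70D, pixels
    mon_w, mon_h = 1920, 1080        # monitor resolution, pixels
    mon_diag_in = 23.0               # assumed monitor diagonal, inches

    # Pixels per inch of the monitor, from its diagonal and pixel counts.
    ppi = math.hypot(mon_w, mon_h) / mon_diag_in   # ~96 ppi

    # At 100% (1 image pixel per screen pixel), the "virtual print" size is:
    print(f"{img_w / ppi:.0f} x {img_h / ppi:.0f} inches")   # ~57 x 38

That works out to roughly 57×38 inches – the same ballpark as the ~56×37 figure above; the exact value depends on the assumed panel size.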

2) Shooting a very dim object that is moving across the frame. One second is far too long to expose the moon at a 300mm focal length without a tracking mount if one is going to look critically at the image at 100% magnification. At 100%, it is easy to see the trails of the two bright stars in your example photo. The moon is blurred by approximately the same amount of movement.²

The moon is not normally a dim object, so we usually do not need to worry about our shutter times being too slow. Even though we usually shoot it at night, the moon’s surface is directly illuminated by the sun. At ISO 100 and f/8, we would normally expose the moon for about 1/125-1/250 second. But during a total eclipse, when the earth blocks the sun’s direct light from illuminating the moon, the moon’s surface gets a LOT darker.³ The earth still rotates at the same rate underneath the sky, so the reduced brightness pushes us into a very tight corner: how do we collect enough light for a usable image without the apparent motion of the moon making it blurry?

The most obvious solution is to use a wider aperture – if one is available. But even moving from, say, f/8 to f/2.8 only gains us three of the thirteen-plus stop difference between a full moon and totality. Going from 1/250 second to 1/15 second only gains another four stops, and at 300mm we are already going to start seeing motion blur when pixel peeping. At that point we are still about 6-10 stops dimmer than when the moon is full. Going from ISO 100 to ISO 1600 gets us back in the ballpark (the stop arithmetic is sketched just after the list below), but we have given up a lot in terms of clarity due to:

  • The much slower shutter time allows some motion blur
  • The wider aperture (most lenses are sharper stopped down than when used wide open)
  • The higher noise associated with using more amplification (ISO) to make up for less light entering the camera, and the noise reduction we then apply to deal with it.
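
Here is that stop counting written out as a short Python sketch. The full-moon baseline (ISO 100, f/8, ~1/250 s) and the 10,000-100,000× dimming range come from the answer and footnote ³; the rest is arithmetic.

    from math import log2

    dimming = (log2(10_000), log2(100_000))  # totality: ~13.3 to ~16.6 stops darker

    aperture = 2 * log2(8 / 2.8)             # f/8 -> f/2.8: ~3 stops gained
    shutter = log2((1 / 15) / (1 / 250))     # 1/250 s -> 1/15 s: ~4 stops gained
    iso = log2(1600 / 100)                   # ISO 100 -> 1600: 4 stops gained

    for label, gained in [
        ("aperture + shutter", aperture + shutter),
        ("aperture + shutter + ISO", aperture + shutter + iso),
    ]:
        lo, hi = (d - gained for d in dimming)
        print(f"{label}: still {lo:.1f} to {hi:.1f} stops short of a full moon")

Aperture and shutter alone leave us roughly 6 to 9.5 stops short; adding the ISO boost trims that to about 2 to 5.5 stops, which is why “back in the ballpark” is about the best we can say.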

3) Atmospheric interference. If you were shooting from the location indicated in your user profile, the moon was fairly low on the horizon at the time. Just as the sun is much more distorted by the earth’s atmosphere at sunrise and sunset than when it is high in the sky, so is the moon. Not only is the light having to travel further at an angle through the ocean of air surrounding our planet, but the temperature differentials near the terminator (the line between daylight and dark) tend to increase atmospheric turbulence in the times around dawn and dusk.

4) Letting the camera make all of the decisions about how to process the raw data from the sensor. This is particularly costly with a dim object, such as the moon during totality, that is moving across the frame and therefore limits our exposure time. Most great moon photos you see (when the moon is not in the earth’s shadow) were saved in a raw file format and post-processed to fine-tune the contrast between darker and lighter areas on the surface of the moon. Color temperature and white balance adjustments, sharpening, and in some cases even digitally applied color filters can bring out the contrast between different areas of the moon. This is even more critical when the photo in question is taken during a total eclipse.

5) The noise reduction applied when using ISO 1000 with a Canon EOS camera. I’m a Canon shooter because, overall, Canon works for what I do. Every system, though, has advantages and disadvantages. One of the areas where Canon falls a little short is the way their cameras handle the “partial stop” ISO settings. For a comprehensive look at why using the “+1/3 stop” ISO settings (such as ISO 125, 250, 500, 1000, 2000, etc.) can make your photos noisier than other ISO settings that are even higher, please see Is it really better to shoot at full-stop ISOs?. The amount of NR the camera applies at ISO 1000 by default will reduce the detail in the image.

¹ Back near the beginning of the consumer digital SLR era, APS-C only lenses were often marketed as “digital” lenses.

² The moon moves roughly 1/2° less per hour than the stars as viewed from the earth’s surface. That also happens to be approximately the moon’s angular size in the sky. So for a one second exposure, the moon would move across the frame 1/3600 of its own diameter less than the nearby stars would during the same exposure.
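
Written out: the moon lags the stars by about 0.5° per hour, or 0.5°/3600 s; divided by its ~0.5° angular diameter, that is (0.5/3600)/0.5 = 1/3600 of a diameter per second of exposure.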

³ This article from Space.com says anywhere from 10,000 to 100,000 times dimmer, depending on the earth’s atmospheric conditions. That’s between 13 and 17 stops darker than a full moon!

lightroom – Family Photos System

I’m sure folks here have a good approach to photo storage/processing and backup. I’m a dad with 4 kids and lots of family photos. I got my first digital camera in 1996 and have pictures stretching back over all the years since. Over time, I’ve used Apple Photos, Flickr, Google Photos, Lightroom and Dropbox.

I spent a lot of time over the last week pulling all my photos into one folder and removing duplicates. I’m down to about 200GB of photos. I have an Adobe CC subscription and also pay for Google One and Dropbox Business. I like the hybrid of local storage with the cloud for backup and sync.

I’m thinking about this system going forward:

[diagram: proposed family photos system]

So I would connect my phone/camera to any computer and get the photos into Dropbox. Occasionally, I would organize and edit them in Lightroom. Dropbox would back them up, and I would also use Google cloud sync to have them in Google Photos, which is good for integrations (e.g. showing the pics on a Raspberry Pi via DAKboard) and also lets us pull up photos on phones, etc.

Is this a good/bad idea? Would love to hear about any better systems.

color spaces – Where can I see ProPhoto RGB photos

The ProPhoto color space is just another way of describing RGB colors. sRGB, AdobeRGB, and ProPhoto can all describe/display the same colors (using different numbers); and the larger color spaces can describe/display colors the smaller spaces cannot. But just because a color space can describe a given color does not mean an image/scene contains that color.

A monitor is an RGB device, and if you have a monitor that calibrates as “100% AdobeRGB” that does not make it an AdobeRGB device… it has its own profile. The output from a digital camera is RGB as well, but its input does not have a color space at all.

So take a digital camera that can record (react to) all visible light, and even some light that is invisible (IR/UV), in raw… that raw file output would exceed all color spaces in some aspect. You then edit that file in Lightroom (ProPhoto color space) and display it on your 100% AdobeRGB monitor. The monitor space/gamut/capability is the limit of what you can see; but it is much greater than sRGB.

Or you use the camera to record a JPEG, and you set your camera to use the sRGB color space instead of AdobeRGB. So that same image is now limited to sRGB no matter where/how it is displayed. And if your monitor calibrates as 98% sRGB you would probably never see a difference. But if printed, it would not take advantage of the full CMYK color space/capability; whereas an image that was originally in the ProPhoto color space could (assuming the original image contained those colors).
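
To make the gamut difference concrete, here is a rough numeric sketch in Python: take the most saturated green ProPhoto can encode and convert it toward sRGB. The matrices are rounded versions of commonly published reference values (ProPhoto/ROMM to XYZ, a Bradford D50→D65 adaptation, and XYZ to linear sRGB), so treat the exact digits as approximate; the point is where the result lands.

    import numpy as np

    # Linear ProPhoto RGB -> XYZ (D50 white point), rounded reference values.
    PROPHOTO_TO_XYZ = np.array([
        [0.7977, 0.1352, 0.0313],
        [0.2880, 0.7119, 0.0001],
        [0.0000, 0.0000, 0.8249],
    ])

    # Bradford chromatic adaptation from D50 to D65 (sRGB's white point).
    D50_TO_D65 = np.array([
        [ 0.9556, -0.0230,  0.0632],
        [-0.0283,  1.0099,  0.0210],
        [ 0.0123, -0.0205,  1.3299],
    ])

    # XYZ (D65) -> linear sRGB.
    XYZ_TO_SRGB = np.array([
        [ 3.2406, -1.5372, -0.4986],
        [-0.9689,  1.8758,  0.0415],
        [ 0.0557, -0.2040,  1.0570],
    ])

    # The most saturated green ProPhoto can encode (linear, not gamma-encoded).
    prophoto_green = np.array([0.0, 1.0, 0.0])

    linear_srgb = XYZ_TO_SRGB @ D50_TO_D65 @ PROPHOTO_TO_XYZ @ prophoto_green
    print(linear_srgb)   # ~[-0.73  1.23 -0.15]

Components outside 0..1 mean the color has no sRGB representation at all; a real pipeline would gamut-map rather than clip, but the takeaway stands: ProPhoto can carry colors that sRGB, and most monitors, simply cannot show.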


The point of recording as much as possible (raw files), and maintaining as much as possible (ProPhoto), is to take advantage of those gains where possible now, and to enable potential further advantages in the future. Currently there are no monitors that can display all of the visible colors within ProPhoto, but there may be someday. And there are currently many monitors and printers that can exceed the capability of sRGB; but taking full advantage of that requires effort.

photos.app – How to fix the “Photos was unable to open the library” (error 4302) when opening an iPhoto library from an external hard drive

Running Catalina 10.15.7 (19H524)

The iPhoto library is stored on an external hard drive. I’ve tried opening it several times, restarted the computer, and remounted the hard drive.

Any idea how to get it to open?

Found this thread while googling – https://discussions.apple.com/thread/250721038 – but I don’t understand the proposed solution.


photos – Can I store and access an iPhoto library on an external hard drive (and still have it back up to my network storage)?

Apologies for my technical ineptness. What I aim to do is store my large iPhoto library on an external HDD and access it from there, so that it does not use my laptop’s more limited SSD storage. Is it possible to set this hard drive to back up, via my computer, to the same local network-attached storage that my computer backs up to? If so, how would this be set up?

Cheers, Ben.

internal storage – Photos taken this afternoon disappeared after I rebooted my phone – how to recover them?

This morning, since my phone’s internal storage had reached its limit, I moved all the pics in the Camera album to the SD card. In the afternoon, I took a couple of pics which, if I remember correctly, were stored by default in the same Camera album (on the SD card) to which I had moved the other pictures in the morning. I checked them on the way home and they were still there. At some point, I wanted to edit one of the new ones but it kept giving me errors (no connection, and “you can edit only pics larger than 50 x 50 pixels” – which surprised me, since they are always larger than that!).
I rebooted my phone, and when I opened the gallery the pics taken in the afternoon were gone (not to mention how many pics got corrupted among the ones I moved in the morning). I took other pictures; they still ended up in the Camera folder (on the SD card!) and still gave the editing error. I rebooted again, and they disappeared again.

Even weirder, after some time I retook some pics and these were saved in the Camera folder in the internal storage (not in Camera on the SD!), and they are not disappearing anymore.

I have already tried re-inserting the SD card. What can I do?

360 panorama – Mathematics to go from a set of 360 photos (a bunch of JPGs with spherical or equirectangular projection) into a 3D model with dimensions?

How is a set of multiple 360 pictures (such as Matterport ones) converted to a 3D model with dimensions?

Can it be done? If yes, what is the math behind it?

Is the 3D model usually a point cloud created with photogrammetric techniques?

How does this differ from the conversion from 2D pictures to a 3D model that is done with outdoor drones?

The latter takes a set of 2D pictures at different locations with ~80% overlap, creates the point cloud (a 3D “discrete” model) with photogrammetric techniques, then obtains the 3D “continuous” model by meshing the point cloud. It also needs RTK-GPS for cm-accurate positioning, or alternatively a set of Ground Control Points with known coordinates.

How is the process different if the starting point is a set of 360 pics?
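
For context, the core geometric difference is the camera model. A pinhole photo maps each pixel to a ray through a focal point; an equirectangular 360 frame maps each pixel directly to a direction on the unit sphere. A minimal Python sketch of that mapping, assuming the standard equirectangular layout (the function name is mine, not from any particular photogrammetry package):

    import math

    def equirect_pixel_to_ray(u: float, v: float, width: int, height: int):
        """Map an equirectangular pixel (u, v) to a unit direction vector.

        Assumes the usual layout: u spans longitude -pi..pi left to right,
        v spans latitude +pi/2 (up) .. -pi/2 (down) top to bottom.
        """
        lon = (u / width - 0.5) * 2.0 * math.pi
        lat = (0.5 - v / height) * math.pi
        return (
            math.cos(lat) * math.sin(lon),  # x: right
            math.sin(lat),                  # y: up
            math.cos(lat) * math.cos(lon),  # z: forward
        )

    # The center of a 4096x2048 pano looks straight ahead along +z:
    print(equirect_pixel_to_ray(2048, 1024, 4096, 2048))  # ~(0.0, 0.0, 1.0)

Once pixels are rays, the rest of the pipeline (feature matching, bundle adjustment, dense point cloud, meshing) is the same structure-from-motion machinery used with drone imagery; absolute scale still has to come from GCPs, RTK, or a known distance in the scene.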