SQL Server database in a shared folder cannot be attached in Docker on Linux

The problem is that my SQL Server 2017 Linux container cannot read my database files. I think it has something to do with permissions and I'm sure I've missed something, but I can't figure it out.

I created a clone using the PowerShell module dbaclone.
This module creates and attaches a differencing disk based on a parent virtual disk, which simplifies the deployment of large databases.

The databases are made available through a partition access path that looks something like this:

[Screenshot: partition access path]

The example folder "AW2017-C1" contains two folders, "Data" and "Log", which hold the data and log files respectively.

The folder permissions grant "Everyone" "Full Control".

[Screenshot: folder permissions]

I have a Linux VM running in VMware Workstation, with Docker installed on it. I shared the clone folder with the VM:

[Screenshot: shared folder settings]

To make it easier to use, I added the share to /etc/fstab so that it is mounted automatically when the VM starts:

[Screenshot: /etc/fstab entry]
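
For reference, a hypothetical /etc/fstab entry for a CIFS share could look like the line below (server address, share name, credentials file, and uid/gid are placeholders, since the actual entry was only visible in the screenshot):

    # Hypothetical CIFS mount; adjust server, share, credentials, and uid/gid to your setup.
    //192.168.1.10/dbaclone  /home/sander/shares/dbaclone  cifs  credentials=/home/sander/.smbcredentials,uid=1000,gid=1000  0  0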

The permissions are all set so that everyone can read the files:

Enter the image description here

I use the following command to run the Docker container:

sudo docker run -e 'ACCEPT_EULA=Y' \
    -e 'SA_PASSWORD=Myp@ssw0rd' \
    -p 1433:1433 --name sql1 \
    -v /home/sander/databases:/databases \
    -v /home/sander/shares/dbaclone:/var/opt/mssql/data/dbaclone \
    -d mcr.microsoft.com/mssql/server:2017-latest

The docker run command mounts the directories and makes them accessible inside the container.

The file permissions inside the container look correct:

[Screenshot: file listing inside the container]

Everything looks fine, but when I run the following script:

USE master;

CREATE DATABASE AdventureWorks2017
ON PRIMARY
       (
           FILENAME = '/var/opt/mssql/data/dbaclone/AW2017-C1/Data/AdventureWorks2017.mdf'
       ),
       (
           FILENAME = '/var/opt/mssql/data/dbaclone/AW2017-C1/Log/AdventureWorks2017_log.ldf'
       )
FOR ATTACH;

I get this error:

Msg 5120, Level 16, State 101, Line 3
Unable to open the physical file "/var/opt/mssql/data/dbaclone/AW2017-C1/Data/AdventureWorks2017.mdf". Operating system error 2: "2(The system cannot find the file specified.)".
Msg 1802, Level 16, State 7, Line 3
CREATE DATABASE failed. Some file names listed could not be created. Check related errors.
Completion time: 2020-02-19T13:07:36.1675676+01:00

Obviously this has something to do with permissions, but everything looks fine to me. Maybe someone else has pointers on how to fix this.
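
One diagnostic worth trying (a sketch, not a fix): operating system error 2 means "file not found" rather than "permission denied" (which would be error 5), so it helps to confirm that the exact path and file name are visible from inside the running container:

    # List the data folder as the container sees it; the container name and path
    # match the docker run command above.
    sudo docker exec sql1 ls -l /var/opt/mssql/data/dbaclone/AW2017-C1/Data/

If the listing fails or the .mdf file is missing, the problem is the mount or the path (for example case sensitivity on Linux) rather than the file permissions.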

Terminal – Compare two folder structures and list files that exist in both but differ

I have a local working folder that mirrors part of a web server's public folder. I usually work in the local copy and automatically upload files to the server on save. Lately I've noticed that many of my local files seem to be out of date, so if I save and upload a file, I may overwrite a newer version. This is obviously problematic, so I want to update all outdated local files.

The best way I can think of is to download the entire public folder as-is and compare each file to my local copy, then manually review the files with differences (by comparing them in Visual Studio Code). The public server folder contains about 5 GB of extra material that I don't need (or want) in my local working folder, so I have to filter out the unwanted material first.

In other words, I'm looking for a way (GUI or Terminal) to do the following:

  • Provide two top-level directories as input
  • Recursively iterate through both directories and select files that exist in both (at the same relative path)
  • Compare each pair of matching files and list those where the two files are not identical

Is there a fairly straightforward way to do this?
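
A minimal Terminal sketch using diff (the two paths are placeholders for the downloaded server copy and the local working copy):

    # -r recurses into both trees, -q reports only whether files differ.
    # "Files ... differ" lines are files that exist in both trees with different
    # contents; the grep drops the "Only in ..." lines for one-sided files.
    diff -rq ~/server-copy ~/local-copy | grep '^Files .* differ$'

Pointing it at only the subfolders you care about also takes care of filtering out the unwanted 5 GB.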

How do I synchronize a SharePoint document library with a local folder using PowerShell or another tool?

I want to build a solution where documents from a document library are automatically synchronized with a local folder. The synchronization should not just download files; any changes made to the synchronized documents should also be uploaded back to SharePoint. I found a few articles, but they only cover downloading.

Is there any way to do this with PowerShell, or with a OneDrive API if one is available?

https://powershell.org/forums/topic/backup-sp-online-document-library-locally/
https://www.c-sharpcorner.com/blogs/get-all-documents-from-a-sharepoint-document-library-to-a-local-folder-using-powershell
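
For the upload direction, a minimal sketch with the PnP PowerShell module could look like this (the site URL, library name, and local path are placeholders, and this is a one-way upload loop rather than true two-way sync):

    # Requires the PnP.PowerShell module: Install-Module PnP.PowerShell
    Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/team" -Interactive

    # Upload every file in the local folder back into the document library.
    Get-ChildItem -Path "C:\LocalDocs" -File | ForEach-Object {
        Add-PnPFile -Path $_.FullName -Folder "Shared Documents"
    }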

Magento CACHE folder very large?

I looked into the Magento var folder and found that the cache folder is not cleaned up when I flush the FPC in the Magento admin panel. The size stays the same. How am I supposed to fix that?

What is strange is that there are also two files, "cache_fpc" and "cache_fpc–": when I flush the FPC in Magento, the "cache_fpc" file changes and gets smaller, but the "cache_fpc–" file stays the same size.

What is the "cache_fpc–" file ???

Magento is version 1.9.3.8
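
To see what is actually taking up the space, and to clear the file cache by hand, something like this should work from the Magento root (a sketch for a standard Magento 1 layout; Magento rebuilds the cache on the next request):

    # Show the size of each entry under var/cache.
    du -sh var/cache/*

    # Remove the file cache entirely; it is rebuilt automatically.
    rm -rf var/cache/*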

Regards

ubuntu – Allow web users to write to a folder on Nginx

I have a website running on an Nginx server on Ubuntu, with profile picture uploads handled by Django. However, when I try to upload a new profile picture, it says:

[Errno 13] Permission denied: '/home/xyz/djangodir/media/profile_pics/profilepicxyz.png'

How can I allow web users to write to the /profile_pics folder?

I'm an absolute beginner in servers and nginx, so it could be a really easy task.
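
A common fix is to hand the upload folder to the user the Django application runs as (a sketch assuming that user is www-data, which is typical for gunicorn/uwsgi behind Nginx on Ubuntu; the path is taken from the error message above):

    # Give the web application user ownership of the upload folder...
    sudo chown -R www-data:www-data /home/xyz/djangodir/media/profile_pics
    # ...and make sure the owner can create and modify files in it.
    sudo chmod -R u+rwX /home/xyz/djangodir/media/profile_pics

Note that it is the Django process (gunicorn/uwsgi), not Nginx itself, that needs write access here.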

Which GSA files are stored in the "Failed" folder?

Hi there!
I was wondering whether GSA stores the list of failed websites somewhere.
For example, for blog comments I would like to run a first project with CapMonster/CB only, and then re-run the failed sites with 2captcha, assuming the reason for the failure was an incorrect captcha solution.
I deleted the contents of the "Failed" folder to start fresh, but it is still empty after 10 hours of submissions/verifications.

How do I install SQL Server 2017 Express in "Quiet Simple" mode without an extraction folder?

I need to install SQL Server 2017 Express with as little user interaction as possible. I am using the SQLEXPR_x64_ENU.exe setup file, which I found as a Microsoft download.

Currently I can do exactly what I want with 2012 using the following parameters:

/FEATURES="SQL, Tools" /QS /IACCEPTSQLSERVERLICENSETERMS /ADDCURRENTUSERASSQLADMIN=1 /ACTION="Install" /ERRORREPORTING=0 /INSTANCENAME="MyDB"

This works great in 2012, but I'm now trying to do the same in 2017. The problem is that an extraction folder called "SQLEXPR_x64_ENU" is created in the same directory as the installer. That is not desirable.

Question: Is there a way to prevent this extraction folder from being left behind so that it works the same way as the 2012 setup?

Alternatively, one of the following solutions would be acceptable:

  • Allow the user to choose the extraction location, as with the standard (non-silent) installation, but keep /QS mode for the rest of the setup
  • Have the extraction folder deleted automatically after setup (it wouldn't be so bad if it were cleaned up on exit)
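
One possible workaround (a sketch, not verified against the 2017 package): these self-extracting setup packages generally accept an /X: switch that extracts to a folder of your choice without installing, so you could extract to a temp location, run SETUP.EXE from there with your usual parameters, and remove the folder afterwards:

    REM Extract the package to a temporary folder (assumes the /X: extraction switch).
    SQLEXPR_x64_ENU.exe /X:C:\Temp\SQL2017

    REM Run setup in Quiet Simple mode with the same parameters as the 2012 install.
    C:\Temp\SQL2017\SETUP.EXE /FEATURES="SQL, Tools" /QS /IACCEPTSQLSERVERLICENSETERMS /ADDCURRENTUSERASSQLADMIN=1 /ACTION="Install" /ERRORREPORTING=0 /INSTANCENAME="MyDB"

    REM Clean up the extraction folder.
    rmdir /S /Q C:\Temp\SQL2017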
