t-sql – SQL Server: Automate table updates from one environment to another

I'm looking for a best practice / option for updating a single table from the production environment to the staging environment.

Technical details:

  1. The source table is loaded with new data every day.
  2. The target table should only reflect the new data once the source has been updated.

Some of the options I'm considering:

  1. Transactional replication
  2. SSIS
  3. T-SQL script run by a SQL Agent job

My first preference would be option 3. Would that be a good choice? If so, could someone point me to a script?
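As a rough starting point for option 3, here is a minimal sketch of a MERGE-based sync that a SQL Agent job could run on a schedule. The table and column names (dbo.SourceTable, dbo.TargetTable, Id, Payload, LoadDate) are placeholders, and the source would typically be reached through a linked server or a staging copy rather than directly.

-- Minimal sketch: upsert new or changed rows from the source table into the target.
-- Table and column names are placeholders; adjust them to the real schema.
MERGE dbo.TargetTable AS tgt
USING dbo.SourceTable AS src
    ON tgt.Id = src.Id
WHEN MATCHED AND src.LoadDate > tgt.LoadDate THEN
    UPDATE SET tgt.Payload  = src.Payload,
               tgt.LoadDate = src.LoadDate
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, Payload, LoadDate)
    VALUES (src.Id, src.Payload, src.LoadDate);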

office365 – Can you use Microsoft Planner in a home environment?

I have been using Planner for a while in a business environment and would like to use it at home (for personal use).
I have a valid Office365 license and all other apps (Excel, Word, PowerPoint, Outlook, OneDrive) work fine. There is no problem with my license. But it appears that Planner is simply not available on the Office365 website.

Is it not possible to use Planner without an in-house server? Or, if it is possible, how would I get access to it?

Run Terraform Azure infrastructure code to create an environment

I'm currently running my Terraform from my laptop, which is obviously not optimal:

module "eu_resource_group" {
  source                        = "./modules/resource_groups"

  resource_group_name           = var.resource_group_name
  resource_group_location       = var.location
}

module "vault" {
  source                        = "./modules/vault"

  resource_group_name           = module.eu_resource_group.eu_resource_group_name
  resource_group_location       = module.eu_resource_group.eu_resource_group_location
}

module "storage" {
  source                        = "./modules/storage"

  resource_group_name           = module.eu_resource_group.eu_resource_group_name
  resource_group_location       = module.eu_resource_group.eu_resource_group_location

  storage_account_name          = var.storage_account_name
  storage_container_name        = var.storage_container_name
}

I would like to have Azure run this instead.

How would I get Azure to run it? Would creating an Azure DevOps project with a CI pipeline, etc., be the way to go?

How do users run their Terraform infrastructure code in a non-local environment?
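For reference, a minimal sketch of what an Azure DevOps pipeline (azure-pipelines.yml) that runs this code might look like, assuming the service principal credentials are stored as pipeline variables and a remote backend is configured for the Terraform state. The variable names and agent image below are assumptions, not something from the original setup.

trigger:
  - master

pool:
  vmImage: ubuntu-latest

steps:
  # Assumes terraform is available on the agent image and that the ARM_* service
  # principal variables (ARM_CLIENT_ID, ARM_TENANT_ID, ...) are defined on the pipeline.
  - script: terraform init
    displayName: Terraform init
    env:
      ARM_CLIENT_SECRET: $(ARM_CLIENT_SECRET)   # secret variables must be mapped explicitly
  - script: terraform plan -out=tfplan
    displayName: Terraform plan
    env:
      ARM_CLIENT_SECRET: $(ARM_CLIENT_SECRET)
  - script: terraform apply -auto-approve tfplan
    displayName: Terraform apply
    env:
      ARM_CLIENT_SECRET: $(ARM_CLIENT_SECRET)

The same layout works in other CI systems; the key point is that the state has to live in a shared backend (for example an Azure Storage account) once Terraform no longer runs from a single laptop.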

Unity – When I move my character / camera, the environment stutters

I am learning to make a tile-based game with Unity using the Zelda ALttP assets.
I only have tiles, a camera and a moving character with a Rigidbody2D and a CircleCollider.

When the camera follows the character, the scene feels jerky.

The relevant details:

  • The camera uses the 2D Pixel Perfect package.
  • The character is moved via Rigidbody2D.MovePosition() in the FixedUpdate method:

Code:

void Update()
{
    float yInput;
    float xInput;

    #region yMovement
    if (Input.GetKey(keyMoveTop))
    {
        if (Input.GetKey(keyMoveBot))
        {
            yInput = 0f;
        }
        else yInput = 1f;
    }
    else if (Input.GetKey(keyMoveBot))
    {
        yInput = -1f;
    }
    else yInput = 0f;
    #endregion

    #region xMovement
    if (Input.GetKey(keyMoveLeft))
    {
        if (Input.GetKey(keyMoveRight))
        {
            xInput = 0f;
        }
        else xInput = -1f;
    }
    else if (Input.GetKey(keyMoveRight))
    {
        xInput = 1f;
    }
    else xInput = 0f;
    #endregion

    //Walking
    isWalking = Input.GetKey(keyMoveWalk);
    moveDirection = Vector2.ClampMagnitude(transform.right * xInput + transform.up * yInput, 1f);
}

private void FixedUpdate()
{
    _rb.MovePosition(_rb.position + moveDirection * (isWalking ? walkSpeed : runSpeed) * Time.fixedDeltaTime);
}

You can see the problem here:
https://youtu.be/MA2zZPME5X4

Edit: added all of my movement code.

Strategies for Test Environment Variables – Software Engineering Stack Exchange

I am in the process of implementing an API endpoint for the OPTIONS request used in the pre-flight check for CORS calls.

The allowed-origin host differs between "local", "test" and "prod", so I moved this host into a dotenv file.

Now, when I write a unit test to check this, there is a problem.

Suppose local is localhost:3000, test is test.site.com and prod is site.com.

So the test verifies the header as follows:

assertTrue(response.getHeaders()("Access-Control-Allow-Origin")(0)==="localhost:3000")

and this assertion only passes in the local environment.

A few ideas I had, and the reasons why I think they don't cover all the cases:

Hard-coding the value as above means the test fails in every other environment.

Load the env file and assert against it, but this is pointless, since I would only be confirming that the file matches itself.

Load the env file for each environment and encode the expected value in the assertion (see the sketch below). This could work, though the goal was not to have local (or any environment-specific) values appear in the test itself.
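As a minimal sketch of that third idea, assuming a Node test runner (Jest-style globals) with dotenv: the variable name ALLOWED_ORIGIN and the helper fetchOptions() are hypothetical placeholders for the real config key and the real OPTIONS request.

// Minimal sketch: derive the expected origin from the same env file the
// application loads, instead of hard-coding "localhost:3000" in the test.
// ALLOWED_ORIGIN and fetchOptions() are hypothetical names.
import * as dotenv from "dotenv";

// Hypothetical helper that issues the OPTIONS request against the running API.
declare function fetchOptions(path: string): Promise<{ headers: Record<string, string> }>;

dotenv.config({ path: `.env.${process.env.NODE_ENV ?? "local"}` });

test("pre-flight returns the configured allowed origin", async () => {
  const expectedOrigin = process.env.ALLOWED_ORIGIN;
  const response = await fetchOptions("/api/resource");
  expect(response.headers["access-control-allow-origin"]).toBe(expectedOrigin);
});

This way the assertion follows whatever environment the test runs in, at the cost of the test no longer pinning one exact origin value.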

Linux Docker | Unknown environment `bash` | Sub-process /usr/bin/dpkg returned an error code (1)

My goal is to get a Docker container up and running with nordvpn installed and connected.

Getting the Docker container going:

sudo docker pull ubuntu:latest
sudo docker run -it ubuntu bash
# now I'm inside the docker container
apt update
apt install wget
wget {{nordvpn_link.deb}}
dpkg -i {{nordvpn_link.deb}}
# some errors about dependencies after the above command, so ...
apt install -f
# then
apt install nordvpn

The first error:

root@f706a3f4012f:/home# apt install nordvpn
Reading package lists... Done
Building dependency tree       
Reading state information... Done
nordvpn is already the newest version (3.6.0-2).
0 upgraded, 0 newly installed, 0 to remove and 1 not upgraded.
1 not fully installed or removed.
After this operation, 0 B of additional disk space will be used.
Do you want to continue? (Y/n) 
Setting up nordvpn (3.6.0-2) ...
(ERROR) Unknown environment `bash'
dpkg: error processing package nordvpn (--configure):
installed nordvpn package post-installation script subprocess returned error exit status 255
Errors were encountered while processing:
nordvpn
E: Sub-process /usr/bin/dpkg returned an error code (1)

I read here that I should run the following command:

dpkg --configure -a
# errors
Setting up nordvpn (3.6.0-2) ...
(ERROR) Unknown environment `bash'
dpkg: error processing package nordvpn (--configure):
installed nordvpn package post-installation script subprocess returned error exit status 255
Errors were encountered while processing:
nordvpn

I'm not sure why this happens in the Docker container, as the same process went smoothly on my regular Ubuntu desktop installation.

Network – HTTPS websites cannot be accessed in a secure Citrix cell environment

I installed an application in a secure Citrix cell environment that accesses HTTPS websites hosted on IIS web servers outside the network.

Which firewall settings need to be enabled so that the applications installed in the Citrix cell environment can access this data?

Version control – What is the best Git workflow with CI/CD when using submodules, with a master branch deploying to a test environment and a stable branch deploying to a production environment?

To be honest, I've always struggled to find good workflows for dealing with submodules, especially for the reason you pointed out:

For this to work, we have to update the submodules every time we merge master into stable.

I'm not a fan of having to coordinate a lot of merges across a number of repos just to get the daily changes in. Do the main repo and the submodules often change in tandem? Are the submodules also used in other projects, or are they only used by the project in the main repo? Would you be willing to get rid of the submodules and go with a single repo? I think your workflow would be as clean as you described it if you dropped the submodules.

There are situations where submodules are required. However, if you decide that they add more work and complexity than value, you can roll the submodules into your main repo while preserving their history. In the main repo, remove the submodules completely and commit that change. Then, in each submodule's repo, move its contents into a folder named after the submodule and commit that. Finally, back in the main repo, add each submodule repo as a remote and merge that remote's master into your main repo branch with the --allow-unrelated-histories option; a sketch of the commands follows below.
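A rough sketch of those commands, assuming a single submodule called libfoo checked out at libs/libfoo; the paths, remote name and branch names are placeholders.

# In the main repo: remove the submodule and commit (paths are placeholders).
git submodule deinit -f libs/libfoo
git rm -f libs/libfoo
git commit -m "Remove the libfoo submodule"

# In the submodule's own repo: move its contents into a folder named after it
# (hidden files such as .gitignore would need to be moved explicitly as well).
mkdir libfoo
git mv -k * libfoo/
git commit -m "Move libfoo contents into a libfoo/ subfolder"

# Back in the main repo: add the submodule repo as a remote and merge its history.
git remote add libfoo ../libfoo
git fetch libfoo
git merge libfoo/master --allow-unrelated-histories

After the merge, the former submodule's files live directly in the main repo with their full history, and the daily master-to-stable merges no longer have to be coordinated across repos.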

Javascript – Passing environment variables to a Vue.js application via the docker-compose up command?

For several hours now I have been trying to pass an environment variable through the docker-compose up command, and I can't seem to get it to work. I am relatively new to Docker, and the various methods I've tried have left me baffled.

Basically, I want to pass the API version as an argument when I call Compose and then access this variable from the Vue.js application. From what I have read, it should be available on the process.env object.

I tried the following:

Dockerfile

# Get Base Image (Full .NET Core SDK)
FROM mcr.microsoft.com/dotnet/core/sdk:3.1 AS build-env
WORKDIR /app

# Copy csproj and restore
COPY *.csproj ./
RUN dotnet restore

# Copy everything else and build
COPY . ./
RUN dotnet publish -c Release -o out

# Generate runtime image
FROM mcr.microsoft.com/dotnet/core/aspnet:3.1
WORKDIR /app
EXPOSE 80
COPY --from=build-env /app/out .
ENTRYPOINT ["dotnet", "ServerName.dll"]

docker-compose.yml

version: '3.5'

services:
    ms-sql-server:
        image: mcr.microsoft.com/mssql/server:2017-latest-ubuntu
        ports: 
            - "1430:1433"
    api:
        build: .
        image: omitted_path
        restart: always
        depends_on: 
            - ms-sql-server
        environment:
            DBServer: "ms-sql-server"
        ports:
            - "50726:80"
    client:
        build: ../../omitted
        image: omitted_path
        restart: always
        depends_on: 
            - api
        environment: 
            API_VERSION: "${API_VERSION}"
        ports:
            - "8080:80"

I call API_VERSION=v2 docker-compose up, and everything builds and runs as it should. However, the process.env object does not contain the variable, nor any default value specified in the Dockerfile.
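For what it's worth, one commonly used approach (not something from the post above) is to pass the value as a build argument instead, because a Vue CLI app reads process.env at build time and only exposes variables prefixed with VUE_APP_ to the browser bundle. The client Dockerfile lines and variable names below are assumptions about a typical setup.

In docker-compose.yml, the client service would pass the value as a build arg:

    client:
        build:
            context: ../../omitted
            args:
                API_VERSION: "${API_VERSION}"

and the client's Dockerfile would turn that arg into a VUE_APP_ variable before the build step:

    ARG API_VERSION
    ENV VUE_APP_API_VERSION=$API_VERSION
    RUN npm run build   # Vue CLI bakes VUE_APP_* variables into the bundle here

The application would then read it as process.env.VUE_APP_API_VERSION. A runtime variable set via the Compose environment: section is only visible to the process inside the container, not to JavaScript running in the browser.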