Is installing another desktop environment on a stock Ubuntu 20.04 install considered reasonably safe/OK? Low risk for serious issues?

Currently running an up-to-date stock Ubuntu 20.04 desktop with GNOME. In the past I’ve heard it can cause some problems/issues if one installs multiple desktop environments. I'm not sure if that sentiment was ever true, or if it’s still true today. With that said, if one wanted to install the Budgie DE on a stock Ubuntu 20.04 install, is that considered generally safe/OK? Low risk for serious issues?
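For reference, the install in question would be a single metapackage; ubuntu-budgie-desktop is the one in the 20.04 archive (it may also prompt for which display manager to use):

sudo apt install ubuntu-budgie-desktop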

python – ModuleNotFoundError: No module named ‘flask_wtf’ without virtual environment

I use Python 3 and I start my web app with
sudo python3 app.py

I am not using a virtual environment.

pip3 list shows Flask-WTF 0.14.3 is installed.

but from flask_wtf import FlaskForm raises the error ModuleNotFoundError: No module named 'flask_wtf'.

Both files are in the same folder.
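For reference, one quick way to check whether the Python that sudo runs sees the same packages as pip3 (a diagnostic sketch, not a guaranteed fix):

sudo python3 -c "import flask_wtf; print(flask_wtf.__file__)"
pip3 show flask-wtf    # compare the Location: line with the path above

If the import only fails under sudo, the two commands are resolving different interpreters or package paths.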

Can someone help, please?

regards
Andreas

I searched Stack Overflow but found no solution.

magento2 – Setting varnish full page cache settings via environment variables

I’m trying to set the Varnish settings through environment variables. I have the following ones set:

CONFIG__DEFAULT__SYSTEM__FULL_PAGE_CACHE__VARNISH__BACKEND_HOST: varnish-service
CONFIG__DEFAULT__SYSTEM__FULL_PAGE_CACHE__VARNISH__BACKEND_PORT: 8090

If I go to the settings for Full Page Cache, I can verify that it’s set to the above-mentioned values.

My problem comes when I try to run bin/magento cache:clean. In system.log I get the following error: main.CRITICAL: Unable to connect to varnish:80. Error #0: stream_socket_client(): unable to connect to varnish:80. As you can see, these values are not respected when cleaning the cache.

To work around this for now I run bin/magento setup:config:set --http-cache-hosts=varnish-service:8090, which solves the issue, but I would still like Magento 2 to respect the values I set through the environment variables so I can control this without running commands on the containers.
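For reference, one way to confirm what Magento actually resolves for those configuration paths (bin/magento config:show has been available since 2.2, so it should work on 2.3.1; the paths below mirror the variable names):

bin/magento config:show system/full_page_cache/varnish/backend_host
bin/magento config:show system/full_page_cache/varnish/backend_port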

Magento version is 2.3.1 and I’m running my containers in Kubernetes.

bash – Ansible to automate dev environment setup

This is a follow-up to Bash script to automate dev environment setup.

In that question I’d thrown together a (sloppy) shell script to automatically set up my development environment. One of the answers suggested using Ansible, and after a bit of reading I realized it would help me with some configuration of remote servers as well, so I decided to give it a go.

Below is a playbook that sets up my dev environment the same way as the original Bash script, but hopefully a bit cleaner. I’m planning on using Ansible to set up a new CI/CD pipeline in a reproducible way, as a replacement for the GitHub -> Docker Hub -> manual deployment flow, so this is really testing the waters with Ansible before moving on to that.

Right now, all I need to do is clone the repo the two files are in and then run bootstrap.sh, and everything gets set up from there, for example:
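(The URL is a placeholder for the real repository; per the mv inside bootstrap.sh, the script expects to be run from $HOME with the checkout inside it.)

git clone https://github.com/<user>/AnsibleDevEnv.git
bash AnsibleDevEnv/bootstrap.sh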

Any/all pointers appreciated!

bootstrap.sh:

#!/usr/bin/env bash
# Bring the system up to date and install Ansible.
sudo apt update && sudo apt -y upgrade
sudo apt install -y ansible
# The script assumes it is run from $HOME with the repo cloned there.
mv AnsibleDevEnv/setup.yml ~/
ansible-playbook setup.yml
# Pick up any profile changes the playbook made in this shell.
. .bash_profile

And then the Ansible playbook:

---
- name: Dev Setup
  hosts: localhost
  vars:
  
    folders:
    - go
    - python
    - js
    - pemKeys
    
    downloads:
      url:
      - https://deb.nodesource.com/setup_14.x
      - https://repo.anaconda.com/archive/Anaconda3-2020.02-Linux-x86_64.sh
      - https://storage.googleapis.com/golang/getgo/installer_linux

      sudo_files:
      - setup_14.x
      - Anaconda3-2020.02-Linux-x86_64.sh

      user_files:
      - installer_linux
      
    keys:
      - https://packages.microsoft.com/keys/microsoft.asc
      - https://download.docker.com/linux/debian/gpg
      
    repos:
      - deb [trusted=yes arch=amd64] https://download.docker.com/linux/debian {{ docker_version.stdout }} stable
      - deb [arch=amd64] https://packages.microsoft.com/repos/vscode stable main
      
    packages:
      - apt-transport-https 
      - ca-certificates 
      - gnupg2 
      - software-properties-common
      - libgl1-mesa-glx 
      - libegl1-mesa 
      - libxrandr2 
      - libxss1 
      - libxcursor1 
      - libxcomposite1 
      - libasound2 
      - libxi6 
      - libxtst6
      - libpq-dev 
      - python3-dev
      - python3-pip
      - protobuf-compiler
      - code
      - nodejs
      - postgresql-11
      - docker-ce
      
    node_lib:
      - react
      - react-scripts
      - react-dom
      
    go_get:
      - go get github.com/lib/pq
      - export
      - GO111MODULE=on go get github.com/golang/protobuf/protoc-gen-go
      - GO111MODULE=on go get -u google.golang.org/grpc
      
    pip:
      - psycopg2
      
    git_config:
      name: 
        - user.name
        - user.email
        - color.ui
      value:
        - cmelgreen
        - cmelgreen@gmail.com
        - true

  tasks:
    - name: make folders
      file:
        path: './{{ item }}'
        mode: 0755
        state: directory
      with_items: '{{ folders }}'
      
    - name: install rpm
      apt:
        name: rpm
        state: latest
        update_cache: yes
      become: yes
        
    - name: add keys
      apt_key:
        state: present
        url: '{{ item }}'
      with_items: '{{ keys }}'
      become: yes
      
    - name: save docker version to variable
      shell: lsb_release -cs
      register: docker_version
      
    - name: add repositories
      apt_repository: 
        repo: '{{ item }}'
        state: present
      with_items: '{{ repos }}'
      become: yes
        
    - name: download files
      get_url: 
        url: '{{ item }}'
        dest: .
        mode: +x
      with_items: '{{ downloads.url }}'
        
    - name: run as root downloads
      command: './{{ item }}'
      with_items: '{{ downloads.sudo_files }}'
      become: yes
      
    - name: run as user downloads
      command: './{{ item }}'
      with_items: '{{ downloads.user_files }}'

    - name: add source ./.bashrc to .bash_profile
      lineinfile:
        path: ./.bash_profile
        line: 'source ./.bashrc'
        
    - name: install packages
      apt: 
        name: '{{ packages }}'
        state: latest
        update_cache: yes
      become: yes
       
    - name: set docker permissions
      file:
        path: /var/run/docker.sock
        mode: 0666
      become: yes

    - name: install react
      npm:
        name: '{{ item }}'
        global: yes
        state: latest
      with_items: '{{ node_lib }}'
      become: yes

    - name: go get some libraries
      shell: '. ~/.bash_profile && {{ item }}'
      args:
        executable: /bin/bash
      with_items: '{{ go_get }}'


    - name: pip some stuff conda has a hard time with
      pip:
        name: '{{ pip }}'
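One habit that helps while iterating on a playbook like this: ansible-playbook supports a dry run (command/shell tasks are skipped in check mode, but the apt/file/npm tasks will report what would change):

ansible-playbook setup.yml --check --diff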


2013 Workflow Continuously Restarts in 2019 Environment

We are upgrading our 2013 on-prem farm to 2019 on-prem. We’ve finished creating the 2019 farm and migrated the data using AvePoint, a migration tool. I have a 2013 workflow that is set to run on change, but it continuously restarts itself on the 2019 farm. It restarted 21 times in one minute and locks the item for editing while it’s running. Is there a setting we’re missing?

In our 2013 farm, the 2013 workflows that are set to run on change will only run/be triggered if someone manually updates the item. In other words, if the system makes updates behind the scenes after the form is submitted, the 2013 workflow is not triggered.

How do I compile programs using the Windows 10 Linux environment?

I am a beginner to Linux. I’ve installed the Ubuntu and Debian subsystems for Windows. I have to admit, I expected them to behave like PowerShell and Command Prompt do, with commands like ls (or dir, or other variations on these) working. Therefore, I expected that I could drop a file into the virtual directory used by the subsystem and type something like gcc … “myfile.c” to compile and build, and then run the program in the subsystem.

However, when I use Bash, I can’t tell where my Linux directory is, it won’t let me navigate anywhere, commands like ls return no information, and I generally cannot find documentation about how to use Bash to do simple things like compile programs.
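For reference, in a default WSL install the Windows drives are mounted under /mnt (C:\ at /mnt/c), and gcc has to be installed inside the subsystem first; a minimal round trip looks something like this (<your-user> is a placeholder):

sudo apt update && sudo apt install -y gcc
cd /mnt/c/Users/<your-user>/Desktop
gcc myfile.c -o myfile
./myfile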

Can someone help me, please?

linux – Bash script to automate dev environment setup

First of all, I would suggest using ShellCheck (https://www.shellcheck.net/) when writing Bash; it will point out many, many errors, some trivial and some not so trivial.

Line 7:
mkdir $HOME/GoProjects
      ^-- SC2086: Double quote to prevent globbing and word splitting.

Simple enough: if your $HOME has a space in it, say, the mkdir won’t work as you expect.
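The fix is just to quote the expansion:

mkdir "$HOME/GoProjects"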

More subtle

Line 4:
read email
^-- SC2162: read without -r will mangle backslashes.

Maybe not a problem for you, but no harm in doing it right.
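The fixed-up line from the snippet above:

read -r email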

You can use ShellCheck in editors via plugins, and there’s a CLI you can use as well. It's really nice; Bash is very easy to make mistakes in, as we all know, and this will just generally help you.

Do people organise Bash scripts? Yes! I’ve known some old hands who are very skilled with Bash. As far as I know there’s no consensus on the structure to follow, but generally I think you want to be following good programming guidelines: split things into functions, make it modular, etc. It’s good that you have comments, but I think it's better still to replace each comment with a sensibly named function that does the part the comment alludes to, for example:
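(A hypothetical sketch; the name and body stand in for whatever one of your comments currently describes.)

install_go() {
    # was: "# install go" followed by bare commands
    sudo apt install -y golang
}

install_go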

That being said, what you have is, as far as I’m concerned, basically one of the canonical applications of a Bash script, i.e. not doing much that's complicated, just running a bunch of commands in order. You probably run it infrequently as a one-off. Is it worth making it any better? Probably not.

You have no error handling, various things are hard-coded assumptions about the machine you will run this on, and your script isn’t idempotent: when something does go wrong and you run it a second time, it may have funny results. Idempotency is another good thing to aim for; mkdir -p, for example, succeeds on a second run where plain mkdir fails. Is it bad that any of this is missing? Not really.

Personally, I don’t like Bash scripts. I prefer to use a programming language which is a bit more verbose but allows me to be more confident of what I’m doing. For this reason I write my scripts in Node or Python when I can, as these allow me to accumulate a collection of functions which I find easier to reason about, do error handling with, and externalise configuration in.

man in the middle – How does a person under surveillance safely download Tor or Tails in a hostile environment?

One of Tor’s stated goals is to help individuals such as journalists, activists and whistleblowers protect themselves against surveillance, and in many countries people in those lines of work are usually subject to surveillance, especially targeted surveillance.

Given a scenario in which a journalist is working in an environment where he is subject to active targeted surveillance, how would he safely download Tor? Assume that the journalist in question is using a new computer with a freshly installed Linux distribution. In what ways could an adversary with man-in-the-middle capabilities affect or compromise the download?

Does using HTTPS to download Tails, or the distribution's package manager to download Tor, provide enough security to protect against malicious third parties?
How can someone in this scenario safely download Tor or Tails?
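For context, the Tor Project’s own guidance is to verify a detached OpenPGP signature over the download rather than trust the transport alone. A sketch with version placeholders (file names vary by release, and the signing key itself still has to be obtained and cross-checked out of band, which is exactly where a man in the middle matters):

wget https://dist.torproject.org/torbrowser/<version>/tor-browser-linux64-<version>_en-US.tar.xz
wget https://dist.torproject.org/torbrowser/<version>/tor-browser-linux64-<version>_en-US.tar.xz.asc
gpg --verify tor-browser-linux64-<version>_en-US.tar.xz.asc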

Alert/Report on any permission change in SharePoint Environment

I have a SharePoint farm as well as a SharePoint Online tenant, which I need to monitor for any permission change.

I want an alert that will trigger an email and notify me of any change at the farm or tenant level in SharePoint, whether it's an item-level permission or a site-collection-level one.

Or, if not an alert, then a daily/weekly report would suffice.