postgresql – postgres does not save all data after committing

In my Golang project, which uses GORM as the ORM and PostgreSQL as the database, in some situations when I begin a transaction that changes three tables and commit it, only one of the tables changes; the other two tables' data does not change. Any idea how this might happen?

You can see an example below:

// db is an initialized *gorm.DB connected to PostgreSQL
tx := db.Begin()
if tx.Error != nil {
    return tx.Error
}

invoice.Number = 1
if err := tx.Save(&invoice).Error; err != nil {
    tx.Rollback()
    return err
}

receipt.Ref = "1331"
if err := tx.Save(&receipt).Error; err != nil {
    tx.Rollback()
    return err
}

payment.Status = "succeed"
if err := tx.Save(&payment).Error; err != nil {
    tx.Rollback()
    return err
}

if err := tx.Commit().Error; err != nil {
    return err
}

Only the payment data changes, and I'm not getting any error.

import – How do I get the data from this binary file?

I have a file which contains the following binary data:

$BinaryData={2,0,0,0,0,0,0,0,20,0,0,0,0,0,0,0,224,123,92,59,148,254,214,1,46,0,0,0,67,0,58,0,92,0,85,0,115,0,101,0,114,0,115,0,92,0,97,0,116,0,102,0,97,0,105,0,92,0,105,0,67,0,108,0,111,0,117,0,100,0,68,0,114,0,105,0,118,0,101,0,92,0,84,0,104,0,105,0,115,0,32,0,105,0,115,0,32,0,97,0,32,0,116,0,101,0,115,0,116,0,46,0,116,0,120,0,116,0,0,0};

It is interpreted according to the following picture:

(picture not reproduced: it shows the bytes grouped as an 8-byte header, an 8-byte file size, an 8-byte deletion date, and the remaining bytes as the file path)

I can look at the file data in the format shown in the picture using:

Partition[IntegerString[$BinaryData, 16, 2], 16, 16] // Column

Using this I pull out the relevant content as follows:

{$FileSize, $DateDeleted, $FilePath} = Rest@TakeList[$BinaryData, {8, 8, 8, All}];

Now my question is: how do I convert the content from this binary form to the appropriate types? $FileSize is an integer number of bytes, $DateDeleted is a number that should be convertible to a date somehow, and $FilePath is a Unicode string representing a file path.

I am new to parsing binary data, so I might be doing things wrong. Feel free to correct me.
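
The conversions themselves are easy to sanity-check outside Mathematica. Here is a minimal Python sketch of the same logic, assuming little-endian fields and that the 8-byte date is a Windows FILETIME (100-nanosecond ticks since 1601-01-01 UTC), which the byte pattern suggests; parse_record is a hypothetical helper name:

import struct
from datetime import datetime, timedelta, timezone

def parse_record(data: bytes):
    # Layout recovered above: 8-byte header, 8-byte file size,
    # 8-byte deletion date, then a NUL-terminated UTF-16LE path.
    header, file_size, filetime = struct.unpack_from("<QQQ", data, 0)
    path = data[24:].decode("utf-16-le").rstrip("\x00")
    # Assumption: the date field is a Windows FILETIME
    # (100-nanosecond ticks since 1601-01-01 UTC).
    deleted = datetime(1601, 1, 1, tzinfo=timezone.utc) + timedelta(
        microseconds=filetime / 10
    )
    return file_size, deleted, path

In Mathematica, FromDigits[Reverse[bytes], 256] yields the same little-endian integers for the fixed-width fields.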

amazon web services – I want to get AWS Kinesis data using Kubeless

I built a controller that detects writes to Kinesis (kinesistriggers.kubeless.io).

When this controller detects a write to Kinesis, the Python function associated with the stream should be executed.

However, although the write to Kinesis was successful, the function was not executed.

The Python function just prints the event:

def hello(event, context):
    print(event)
    return event['data']

The environment is as follows:

$ kubeless function ls
NAME    NAMESPACE       HANDLER         RUNTIME         DEPENDENCIES    STATUS
hello   default         test.hello      python3.7                       1/1 READY

$ kubeless trigger kinesis list
NAME            NAMESPACE       REGION          STREAM    SHARD         FUNCTION NAME
test-trigger    default         ap-northeast-1  k8stest   shardId-999   hello

$ aws kinesis describe-stream --stream-name k8stest
{
    "StreamDescription": {
        "KeyId": null,
        "EncryptionType": "NONE",
        "StreamStatus": "ACTIVE",
        "StreamName": "k8stest",
        "Shards": [
            {
                "ShardId": "shardId-999"
            }
        ]
    }
}
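
For reference, a minimal boto3 sketch of the kind of write used to exercise the trigger (stream name and region taken from the output above; the payload and partition key are arbitrary illustrative values):

import boto3

# Test write to the stream from the question. Assumes AWS
# credentials are configured in the environment.
client = boto3.client("kinesis", region_name="ap-northeast-1")
client.put_record(
    StreamName="k8stest",
    Data=b'{"hello": "kubeless"}',
    PartitionKey="test-key",
)

If a record written this way shows up in the stream but the function still never runs, the problem is on the trigger/controller side rather than on the producer side.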

privacy – Definition of what is log data in the context of a VPN

I am confused – am I getting something wrong? Does it make any sense that a VPN provider (in this case VyprVPN, but this does not matter – please note that I am not asking for shopping advice or advice for/against this or any other VPN provider; I am asking how to understand privacy claims in general, and what is considered "log" data) claims (accessed 2021-05-16)

Exceptional Privacy … our no-log policy independently audited.

and then (at least honestly) admits (accessed 2021-05-16)

…only collects a minimal amount of information when you
connect over our VPN product (VyprVPN), and only retains it for a
period of 30 days. We retain:

  • Customer’s source IP address (generally the IP address assigned by the customer’s ISP)
  • VyprVPN IP address used by the user
  • Connection start and stop time
  • Total number of bytes used

Wouldn't keeping the times and the source and target IP addresses make the user's online activity quite transparent (aka "log files" in their genuine meaning), in clear contradiction (at least for those 30 days) of a "no-log" policy?

python – Read Data from a serial port and write to influxdb

I have an energy meter which sends the kWh count periodically, every few seconds, via a serial port. To store this data I write the counter value and the calculated average power over the last 10 s into an InfluxDB measurement.

To calculate the power I use a loop that compares the current counter value with the one from 10 s ago. To avoid the loop being blocked by reading the serial interface or by sending the data via HTTP, I use serial_asyncio and aiohttp.

I am pretty new to Python and got my script running by working through several asyncio tutorials, so I am not sure whether I mixed some old and new syntax.

The script is working, but I am not happy with a few things.

  1. Is there a better way to pass the counter value from the serial data_received function to my calc_delta function without the use of global? (A queue-based sketch follows the code below.)

  2. Am I using asyncio properly? I found different examples using asyncio.ensure_future() and loop.run_until_complete().

  3. Is the overall structure okay? What could I have done better?

import aiohttp
import asyncio
from datetime import datetime
import serial_asyncio
import serial
import time


value = None
last_value = 0
token = "myInfluxdbToken"


class Input(asyncio.Protocol):
    data_buffer = ""

    def connection_made(self, transport):
        self.transport = transport
        print("port opened", transport)

    def data_received(self, data):
        global value

        self.data_buffer += data.decode("utf-8")
        # find counter value in received string
        if "eHZ" in self.data_buffer:
            data_tmp = self.data_buffer.replace("\r\n", "")
            value_str = data_tmp.partition("1.8.1*255(")[2][:11]

            if len(value_str) == 11:
                value = float(value_str)
                print(str(value))
                asyncio.ensure_future(send_data(value, "counter"))
            # Reset the data_buffer!
            self.data_buffer = ""


async def calc_delta():
    global value, last_value
    # calculate counter delta
    while True:
        delta = 0
        print(value)
        if value:
            delta = value - last_value
            last_value = value
        # 10s * 3600s/h * 1000W  Calculates Avg Power of last 10s from kWh delta
        deltaW = delta / 10 * 3600 * 1000
        if deltaW < 50000:
            await send_data(deltaW, "deltaW")
            print(time.perf_counter(), deltaW)
        else:
            print(time.perf_counter(), "data invalid")

        await asyncio.sleep(10)


async def send_data(data, tag):
    async with aiohttp.ClientSession() as session:
        async with session.post(
            "http://localhost:8086/api/v2/write?org=my_org&bucket=testdata&precision=s",
            data=f"ehzdata,type={tag} value={data}",
            headers={"Authorization": f"Token {token}"},
        ) as response:
            print("Status:", response.status)


loop = asyncio.get_event_loop()
serial_coro = serial_asyncio.create_serial_connection(
    loop, Input, "/dev/ttyUSB0", baudrate=9600, parity=serial.PARITY_EVEN, bytesize=7
)
counter_coro = calc_delta()
asyncio.ensure_future(serial_coro)
loop.run_until_complete(counter_coro)
loop.run_forever()
loop.close()
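
As a side note on question 1: one way to avoid the globals is to hand each reading from the protocol to the consumer through an asyncio.Queue. A minimal sketch, with the parsing abbreviated (it would stay exactly as in the original):

import asyncio


class Input(asyncio.Protocol):
    def __init__(self, queue: asyncio.Queue):
        self.queue = queue
        self.data_buffer = ""

    def data_received(self, data):
        self.data_buffer += data.decode("utf-8")
        # ... parse value_str as in the original code, then publish
        # the reading instead of assigning a global:
        # self.queue.put_nowait(float(value_str))


async def calc_delta(queue: asyncio.Queue):
    last_value = None
    while True:
        value = await queue.get()  # wakes when a new reading arrives
        if last_value is not None:
            print(value - last_value)
        last_value = value

Since create_serial_connection() takes a protocol factory, the queue can be bound with a lambda, e.g. create_serial_connection(loop, lambda: Input(queue), "/dev/ttyUSB0", baudrate=9600).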

ip – Is it possible to send a PING without any data from a Cisco router

Probably an odd question: I'm working on a project where I'm using an application that only responds to pings without any data attached.

If I ping from this program, the Cisco router replies without any data; but in the other direction the Cisco adds data to the ping packet, and the program replies without any data.

On Linux the default behaviour is the same, although with ping 192.168.3.1 -s 0 I can specify that there is no data in the ping packet, and the replies then work as I intend.

This may be odd, but in this project the ping replies are hardcoded to carry no data regardless of what the request contained (Windows seems not to care, Linux cares but can be adjusted as above, and I can't find a way to tell the router to originate a ping with no data).
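
If the router simply cannot originate a zero-data echo request, one host-side workaround is to craft the probe yourself. A scapy sketch (the target address is the one from the question; raw sockets require root privileges):

from scapy.all import IP, ICMP, sr1

# Echo request with an empty payload: scapy's ICMP() carries no
# data unless a payload is explicitly appended.
reply = sr1(IP(dst="192.168.3.1") / ICMP(), timeout=2)
if reply is not None:
    reply.show()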

open source – Is it possible to build an Android-conformant calendar app which stores its data as a shareable file?

Or is there already one?

I don't know Android's calendar infrastructure, so maybe my question sounds a bit fuzzy.

I dislike (non-self-hosted) cloud services, but I like the ability to share my calendar among devices (and even people). Since I'm surely not alone in this, I wonder whether someone has already built a calendar app which stores its data (encrypted) in a way that can be shared via Syncthing, SparkleShare, Dropbox, …
Such an app would have to 'monitor' changes to this file in order to update itself, notify the user about changes, and not overwrite changes 'from outside'.

So my question is actually two questions: do such apps already exist (and what are they called, so I can search myself), and is this possible in the first place (or why not)?

json – How to send data destined to be inserted into a database from a web browser application

Is there a standard way to send data entered by a user in a web form to the web application, to be inserted into tables in a database?

On the server I have code which uses prepared statements to insert data into database tables. I am designing the flow of data from client to server and back again, and wondered whether there are standard ways to go about this.

My initial idea was: the user enters data in the fields, and when the user hits the send button the values are processed and converted into JSON to be sent to the server as a POST request with content-type: application/json (using Ajax). But the user could enter any old garbage, so what do I do about that? Do I base64-encode the data before converting it to JSON?

I don’t mind the user saving garbage or odd characters in the database. I just have to be careful that I can get the data to the database without any bugs or security concerns.

Does the "take the data, base64-encode it, then convert to JSON" idea sound good? Any alternatives?
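
For what it's worth, a minimal sketch of the usual flow, with hypothetical field names and sqlite3 standing in for the real database: the browser posts the raw values as JSON (JSON escaping already handles arbitrary text), and the server binds them as parameters in a prepared statement, so no base64 step is needed:

import json
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read and decode the JSON body sent by the browser.
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        # Parameter binding keeps user text out of the SQL string,
        # whatever characters it contains.
        with sqlite3.connect("app.db") as conn:
            conn.execute(
                "INSERT INTO comments (author, body) VALUES (?, ?)",
                (payload["author"], payload["body"]),  # hypothetical fields
            )
        self.send_response(204)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()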