mintclass – mintclass crashing (asking me to log back in) with this workflow

If I follow the procedure below, MintClass logs me out. I used this workflow once before and it is very useful; however, this week it is not working.
It seems to work on my desktop, but not on my work iPad. I have tried selecting "request desktop site", but this makes no difference to the problem.

  • Choose a class
  • Show seating plan
  • Choose a student, anyone
  • Select points
  • Select advanced reporting
  • Select student (at this point I am asked to login again, and when I do it has lost track of where I am)
  • Select «year»
  • Select «tutor group»

linux – What do I need to ssh into another computer, given that I have the private and public key of a known host?

From PuTTY, I can log in to a remote server over SSH, since my public key has been "registered" on that server.
At the moment I can access this server only from the one computer running PuTTY.

However, I'd like to gain access from my laptop running Debian 9.
How can I use the keys (the private one, the public one, or both) to log in to that server?
(Without, if possible, any configuration on the remote side.)
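A sketch of what should work, assuming the key pair lives in a PuTTY `.ppk` file (the `puttygen` tool comes from the `putty-tools` package on Debian; file names, user, and host below are placeholders):

```shell
# Convert the PuTTY key to OpenSSH format, then connect with it.
puttygen mykey.ppk -O private-openssh -o ~/.ssh/id_rsa
chmod 600 ~/.ssh/id_rsa        # ssh refuses keys that are readable by others
ssh -i ~/.ssh/id_rsa user@remote-server
```

Only the private key is needed on the client; the public half is what already sits in the server's `authorized_keys`, so no remote configuration is required.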

javascript – Discord bot log

So, I created a discord.js bot and added the following to index.js:

// Log when the bot joins a server
client.on("guildCreate", guild => {
    const logsServerJoin = client.channels.cache.get('757945781352136794');
    console.log(`The bot just joined ${guild.name}, owned by ${guild.owner.user.tag}`);
    logsServerJoin.send(`The bot just joined ${guild.name}, owned by ${guild.owner.user.tag}`);

    // Greet in a channel named "general", if the guild has one
    const guildMSG = guild.channels.cache.find(channel => channel.name === 'general');
    if (guildMSG) {
        guildMSG.send(`Hello there! My original name is \`Bryant\`!\n` +
            `This bot was created by **R 1 J 4 N#7686**\n` +
            `For more info type \`/help\`!\n` +
            `\`Bryant - Official Server:\` https://discord.gg/UsQFpzy`);
    }
});

// Log when the bot leaves a server
client.on("guildDelete", guild => {
    const logsServerLeave = client.channels.cache.get('757945781352136794');
    console.log(`The bot just left ${guild.name}, owned by ${guild.owner.user.tag}`);
    logsServerLeave.send(`The bot just left ${guild.name}, owned by ${guild.owner.user.tag}`);
});

It does not show any error in the terminal. It is supposed to log, in the mentioned channel, where the bot joined and left, but it does not 🤷‍♂️. Can anyone help me out with this?

integration – What is the antiderivative of this double integral with log? $\int \int f(x,y) \ln(f(x,y))\, dx\, dy$

What is the antiderivative, and line-by-line derivation, of

$$\int_0^1 \int_0^1 f(x,y) \enspace \ln(f(x,y)) \enspace dx \enspace dy = \enspace ?$$

Not sure if this would help, but could we generalize from the following well-known rule

$$\int x \ln x \, dx = \frac{1}{2}x^2 \ln x - \frac{1}{4} x^2 + C$$
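For reference, the one-variable rule above follows from a standard integration by parts with $u = \ln x$ and $dv = x\,dx$:

$$\int x \ln x \, dx = \frac{x^2}{2}\ln x - \int \frac{x^2}{2} \cdot \frac{1}{x} \, dx = \frac{x^2}{2}\ln x - \frac{x^2}{4} + C$$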

postgresql – What’s with these empty log files? Why does csvlog mode create plaintext ones?

I’ve been fighting for days now to get logging set up. I’ve had to write a ton of code manually because PG doesn’t provide any automated mechanism to do this, for some reason, nor does the documentation tell you much beyond this: https://www.postgresql.org/docs/12/runtime-config-logging.html#RUNTIME-CONFIG-LOGGING-CSVLOG

I have:

  1. Set up the postgres_log table exactly like it says on that page.
  2. Set up my postgresql.conf like this (also as it says on the page, except it only describes it vaguely and lets me find out everything on my own):
log_destination = 'csvlog'
logging_collector = on
log_directory = 'C:\\pglogs' # Yes, double \ characters are required or else they are removed entirely...
log_filename = 'PG_%Y-%m-%d_%H;%M;%S'
log_rotation_age = 1min
log_rotation_size = 0
log_truncate_on_rotation = on
  3. Coded my own mechanism to constantly go through C:\pglogs for any .csv file, skipping any that PG reports are already in use via pg_current_logfile, feed them into PG’s table, and then delete the file. This took me a huge amount of time and effort, and not a word about it is mentioned in that “manual”.
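For step 3, the per-file import itself boils down to a single \copy call per rotated file (a sketch; the filename and psql connection options here are illustrative):

```shell
# Import one rotated CSV log file into the postgres_log table.
psql -c "\copy postgres_log FROM 'C:\pglogs\PG_2020-09-20_00;56;19.csv' WITH csv"
```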

Questions:

  1. PostgreSQL creates both PG_2020-09-20_00;56;19.csv (in CSV format) and PG_2020-09-20_00;56;19 (in plaintext format) files. I obviously don’t want the extensionless files. Why are they created?
  2. Every minute (as specified) PG creates new log files, even if there’s nothing new to log. This results in an endless stream of empty log files (which my custom script goes through, “imports” and then deletes). How do I tell PG to stop doing that? It seems like pointless wear & tear on my disk to make empty files which are just deleted seconds later by my ever-running script.
  3. Why isn’t all of this automated? Why do I have to spend so much time to manually cobble together a solution to import the CSV files back into PG? In fact, why are they dumped to CSV files in the first place? Why doesn’t PG have the ability to directly log into that database table? It seems like a pointless exercise to dump CSV files which are only going to be COPYed back into the database and then deleted.

PHP REST API with Value Objects – How to log ALL exceptions to an errors array and render it to the user?

I am developing a PHP REST API using value objects. I have value objects such as ID, Date, Name, etc. When their construction fails due to an invalid format or similar, they throw an InvalidArgumentException.

How can I “collect” all the exceptions and, when the script stops, send them in an “errors” array in the JSON response?

The problem is that I think writing hundreds of try/catch blocks, one for each value object, is not the best way, and I cannot find a way to catch multiple exceptions in a single try block.

Value Object that may throw an InvalidArgumentException

I have this inelegant solution:

$errors = [];

try {
    $authorID = new ID('dd');
} catch (Exception $e) {
    $errors[] = $e->getMessage();
}


try {
    $authorID = new ID('ff');
} catch (Exception $e) {
    $errors[] = $e->getMessage();
}

Also, if the value objects are created successfully, I want to create an object from them, for example:

$Insertable = new Insertable($authorID);

How can I do this?

If I have multiple value objects that may throw exceptions, how can I catch them all and build a response with these exceptions as an “errors” array?

Thanks!


How to know how full the MySQL redo log is?

Setting innodb_log_file_size to a higher value improves write performance, but increases crash recovery time (or even restart time, if I understand correctly).

Is there a way to know, at any time, how full the redo log is? To get an idea of how long the crash recovery would take.

Subsidiary question: is there a way to force flushing the redo log, while MySQL is running?
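One way to gauge this (a sketch; connection options are assumptions, and the exact STATUS wording varies between MySQL versions) is to compute the checkpoint age, i.e. how far the log sequence number has run ahead of the last checkpoint, from SHOW ENGINE INNODB STATUS:

```shell
# Checkpoint age = bytes of redo written since the last checkpoint.
# Comparing it to innodb_log_file_size * innodb_log_files_in_group
# gives a rough "how full" figure.
mysql -e 'SHOW ENGINE INNODB STATUS\G' |
awk '/Log sequence number/ {lsn = $4}
     /Last checkpoint at/  {ckpt = $4}
     END {print "checkpoint age: " lsn - ckpt " bytes"}'
```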

How exactly does PostgreSQL expect me to read in/delete the CSV log files?

I’ve been following the (very sparse and cryptic) instructions here: https://www.postgresql.org/docs/12/runtime-config-logging.html#RUNTIME-CONFIG-LOGGING-CSVLOG

  1. I’ve set up the postgres_log table exactly like it says on that page.

  2. I’ve set up my postgresql.conf like this:

    log_destination = 'csvlog'
    logging_collector = on
    log_directory = 'C:\pglogs'
    log_filename = 'PG_%Y-%m-%d_%H;%M;%S'
    log_rotation_age = 1d
    log_rotation_size = 0
    log_truncate_on_rotation = on

  3. I’ve restarted PostgreSQL, and it has created a PG_2020-09-20_00;56;19.csv and PG_2020-09-20_00;56;19 file.

  4. I am able to successfully run a COPY query to import the PG_2020-09-20_00;56;19.csv into my database table, if I explicitly name it.

My problems:

  1. How am I supposed to determine which filename(s) to pick to COPY into the table from my automated, regularly run script? (Since it can’t be the "current" one.)
  2. After I have somehow determined which filename(s) are safe to COPY in, and I’ve loaded them into my table, am I expected to delete these myself?
  3. What’s with the plaintext-format PG_2020-09-20_00;56;19 file? Why is that created when I clearly tell PG to use CSV?

None of this is addressed on the page I linked to and which I’ve been following.
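A minimal sketch of what the import pass could look like (assuming a Unix-style shell and passwordless psql access; the directory is illustrative, and on Windows the same logic applies with adapted paths): ask PG which file is live via pg_current_logfile('csvlog'), \copy every other .csv into postgres_log, then delete it:

```shell
# Skip the file PG is still writing to; import and delete the rest.
current=$(psql -At -c "SELECT pg_current_logfile('csvlog')")
for f in /var/lib/postgresql/pglogs/*.csv; do
    [ "$(basename "$f")" = "$(basename "$current")" ] && continue
    psql -c "\copy postgres_log FROM '$f' WITH csv" && rm -- "$f"
done
```

This answers problems 1 and 2 in practice (skip the current file, delete after a successful \copy), though PG itself never cleans up rotated files for you.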