reactjs – TypeError: Object is not iterable (cannot read property Symbol(Symbol.iterator)) useContext

I'm trying to use context and state, and I'm getting this error.
The error occurs on the line of code below that calls useContext(InventoryContext). I have already wrapped all of my components in InventoryProvider.

const AddInvItem = () => {
  const [name, setName] = useState("");
  const [price, setPrice] = useState("");
  const [products, setProducts] = useContext(InventoryContext);

  const updateName = (e) => {
    setName(e.target.value);
  };
  const updatePrice = (e) => {
    setPrice(e.target.value);
  };

  const addItem = (e) => {
    e.preventDefault();
    setProducts((prevProducts) => [
      ...prevProducts,
      { productName: name, purchasePrice: price },
    ]);
  };
  return (
    <form onSubmit={addItem}>
      <input
        type="text"
        name="productName"
        value={name}
        onChange={updateName}
      />
      <input type="text" name="price" value={price} onChange={updatePrice} />
      <button>Submit</button>
    </form>
  );
};

export default AddInvItem;

Context:

import React, { useState, createContext } from "react";

export const InventoryContext = createContext();

export const InventoryProvider = (props) => {
  const [products, setProducts] = useState([
    {
      productName: "test",
      purchasePrice: "$0",
      id: 1,
    },
  ]);
  return (
    <InventoryContext.Provider items={[products, setProducts]}>
      {props.children}
    </InventoryContext.Provider>
  );
};

Any ideas?

Investigation about read errors on my SSD

Well over a year ago I replaced the internal 1 TB Apple SSD in my MacBook Pro (Retina, Mid 2015) with a Samsung SSD 970 EVO 2 TB unit (and an adapter). Long story short: no problems so far – SMART stats report 47 TB of data unit reads and 27 TB of data unit writes.

For the past couple of days I've been seeing sporadic failures in reading a few files, and the SMART "Media and Data Integrity Errors" count has increased. The problem is sporadic: if I test the file that triggered the error (e.g. by computing an MD5 from the command line), I see "Device error" at first, but the file reads fine after retrying.

Is it possible to tell how serious this is?

navigation – Some pages hide content unless “Read more” is clicked

What you’re probably looking at is a “View”. This view is probably showing a list of “Content” (nodes).

Now, each node is probably set to render in the "Teaser" view mode. But since some of your node types (Article, for example) don't have a "Teaser" view mode configured, they fall back to the "Full" (or "Default") view mode, which renders the whole thing.

What you need to do is find the view mode that your content-listing View is using and the node type of the content that's showing up as Full (for example, "Basic page"), then go to:

Structure > Content Types > (Content Type you want to configure) > Manage Display

And configure that view mode for the node type: display the "Read more" link, show the body text as "Summary or Trimmed", and so on.

I will write and publish UNIQUE guest post On READ WRITE BLOG DA-58 for $10


What will you get with this service?

I will give you a DOFOLLOW and HIGH AUTHORITY DA-58 site.

✓ I will write an article of 400 words
✓ I will submit the article as a high-quality guest post
✓ Permanent post
✓ Links are created manually and safely
✓ 100% satisfaction guaranteed
✓ Do-follow backlink

The metrics of the website are as follows:

Domain Name: Readwriteblog.com

Domain Authority: 58

Page Authority: 40

If you have any questions or want to see the website, Inbox me. I will get back to you asap.


How exactly does PostgreSQL expect me to read in/delete the CSV log files?

I’ve been following the (very sparse and cryptic) instructions here: https://www.postgresql.org/docs/12/runtime-config-logging.html#RUNTIME-CONFIG-LOGGING-CSVLOG

  1. I’ve set up the postgres_log table exactly like it says on that page.

  2. I’ve set up my postgresql.conf like this:

    log_destination = 'csvlog'
    logging_collector = on
    log_directory = 'C:\pglogs'
    log_filename = 'PG_%Y-%m-%d_%H;%M;%S'
    log_rotation_age = 1d
    log_rotation_size = 0
    log_truncate_on_rotation = on

  3. I’ve restarted PostgreSQL, and it has created a PG_2020-09-20_00;56;19.csv and PG_2020-09-20_00;56;19 file.

  4. I am able to successfully run a COPY query to import the PG_2020-09-20_00;56;19.csv into my database table, if I explicitly name it.

My problems:

  1. How am I supposed to determine which filename(s) to pick to COPY into the table from my automated, regularly run script? (Since it can’t be the "current" one.)
  2. After I have somehow determined which filename(s) are safe to COPY in, and I’ve loaded them into my table, am I expected to delete these myself?
  3. What’s with the plaintext-format PG_2020-09-20_00;56;19 file? Why is that created when I clearly tell PG to use CSV?

None of this is addressed on the page I linked to and which I’ve been following.

xcp ng – VM Not Booting XCP-NG failure could not read boot disk

I have a server with a few VMs. The first VMs I set up use the main local storage, which was an SSD. I added a few HDDs via RAID and thought it would be simple enough to add another storage device.

I added the storage with below command:

xe sr-create content-type=user type=lvmohba device-config:device=/dev/sdc shared=false name-label="HDD6TB"

The SR shows up and I can create a VM with the new disk; however, upon booting, it just throws the error "boot device: CD-Rom medium detected – failure could not read boot disk" and then powers off.

I'm not sure if it is my storage, but I have tried a few disks and hit the same issue. I changed the boot order and restarted a few times, and recreated the partitions, but still no luck adding any extra drive other than the one XCP-ng was originally installed on. What am I doing wrong here?

javascript – Can’t read [object Text] using createtextnode()

I'm trying to get the task node's value, but instead I get [object Text]. The taskValue comes from a text input in HTML:

<input type="text" id="taskInput" placeholder="add task">

and here is the javascript code:

let allTasks = [];
let task = {
    Id: taskId,
    taskValue: document.getElementById('taskInput').value,
}
allTasks.push(task);

let p = document.createElement('p')
p.classList.add('toDoItem');
let taskValueNode = document.createTextNode(taskArray[id].taskValue);
p.appendChild(taskValueNode)
let div = document.createElement('div')
div.classList.add('buttons')
let buttonsNode = createButtons(id).editButton + createButtons(id).deleteButton;
listItem.innerHTML = taskValueNode + buttonsNode;
listItem.id = id;
toDo.appendChild(listItem);
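One detail worth noting about the last assignment: `innerHTML` expects a string, and concatenating a DOM node with `+` coerces the node through `toString()`, which yields `[object Text]` rather than the node's contents. The coercion can be shown without a browser using a stand-in object whose string tag is `Text` (an assumption standing in for a real Text node):

```javascript
// Stand-in mimicking how a DOM Text node stringifies: the default
// Object.prototype.toString reports "[object Text]", ignoring the data.
class FakeTextNode {
  constructor(data) { this.data = data; }
  get [Symbol.toStringTag]() { return "Text"; }
}

const taskValueNode = new FakeTextNode("buy milk");
const html = taskValueNode + "<button>edit</button>";
// html === "[object Text]<button>edit</button>" -- the task text is lost.
```

The usual fix is to build the row with `appendChild` (as is already done for `p` above) or to concatenate the plain string (`task.taskValue`) instead of the node when assigning `innerHTML`.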

How do I make a chart read data in a cell that’s separated by comma in Google Sheets?

I’ve added two sheets to your sample spreadsheet: “FormQ3” (short for “Form – Question 3”) and a duplicate of your Charts renamed “Charts-Erik.”

First, never mess with raw form data. It is standard practice to set up a separate sheet or sheets (e.g., my FormQ3) to handle processing data.

If you’ve set up your form to send over multiple, comma-separated answers, Sheets just sees those as one long string. Consequently, charts can only see them as solid strings.

FormQ3 holds one array formula in A1 that separates out the comma-separated lists, along with a few added “bells and whistles” to deal with anomalies:

=ArrayFormula({"Alt Games List";QUERY(IF(LEN(FLATTEN(TRIM(SPLIT(FILTER('Form Responses 1'!C2:C,'Form Responses 1'!C2:C<>""),",",0,1))))<=25,FLATTEN(TRIM(SPLIT(FILTER('Form Responses 1'!C2:C,'Form Responses 1'!C2:C<>""),",",0,1))),"Other"),"Select * Where Col1 Is Not Null")})

Working from the inside out: first, Column C of your raw data is FILTERed to only the non-null rows. SPLIT splits these at the commas, TRIM removes any extra spaces that would have existed after commas, and FLATTEN (an as-yet-undocumented Google Sheets function) forms one column from all of the pieces. IF then checks that the LENgth of each piece is not more than 25 characters (which I figured was safe for any real game name, but would rule out phrases like the one someone put in there); if the LENgth is <=25, the processed game name is added to the list (which is why you see a big block of the formula repeated within itself; it just means "list the processed version"). Anything over 25 characters is assumed not to be a game name, so it is listed as "Other." Finally, a QUERY is run on all of that to get rid of any remaining blank rows in memory.
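The same inside-out pipeline can be mirrored in plain JavaScript to make the steps concrete (the sample rows are made up; the 25-character cutoff is the one from the formula):

```javascript
const rows = ["Chess, Go", "", "If you know of the game Stormworks."];

const altGamesList = rows
  .filter((r) => r !== "")                     // FILTER: drop empty rows
  .flatMap((r) => r.split(","))                // SPLIT at commas + FLATTEN
  .map((s) => s.trim())                        // TRIM stray spaces
  .map((s) => (s.length <= 25 ? s : "Other"))  // IF: long phrases become "Other"
  .filter((s) => s.length > 0);                // QUERY: drop remaining blanks
// altGamesList: ["Chess", "Go", "Other"]
```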

This list is then used to create your second chart (which you can see in “Charts-Erik”).

BTW, if you wanted to be hands-on and go through every submission to cull out game names from long phrases (like your raw data C4 where someone added “If you know of the game Stormworks.”), you can just directly edit the raw-data sheet to replace “If you know of the game Stormworks.” with “Stormworks” in the list. This way, that game name would be added to the FormQ3 list and the chart. You can’t, however, edit the FormQ3 list directly, because it is formula-created. If you try typing in the list, you will break the formula and wind up with an error.

gcloud – Google Cloud Shell Terminal: Unable to read file [Errno 36] File name too long for PEM formatted (self managed) certificate

I have set up a load balancer in Google Cloud via the GUI; now I want to set it up via the gcloud shell.
During the step that creates a (self-managed) certificate, I get this message from the gcloud shell:

ERROR: (gcloud.alpha.compute.ssl-certificates.create) Unable to read file
[-----BEGIN CERTIFICATE-----mycertificate-----END CERTIFICATE-----]: [Errno 36] File name too long: '-----BEGIN CERTIFICATE-----mycertificate-----END CERTIFICATE-----'.

I generated the command from the GUI on the page
https://console.cloud.google.com/net-services/loadbalancing/advanced/sslCertificates.
The generated command even contains a typo: --priavteKey is used instead of the correct --private-key; I changed this manually in my command.
I can see this problem in the stable, beta, and alpha versions of the command gcloud compute ssl-certificates create.
Am I doing something wrong, or could this be a bug, either in generating the command from the GUI or in reading the parameter data from the command?

For this post I've created a self-signed certificate and used its data, so the private key is not confidential. This is the complete gcloud command that I used:

gcloud compute ssl-certificates create certificate-rsa2048 --project=my-project --global --certificate=-----BEGIN CERTIFICATE-----$'\n'MIIDDzCCAfegAwIBAgIUZbQ2aFWcrxsv3y7a6QLGqQ1vBJkwDQYJKoZIhvcNAQEL$'\n'BQAwFzEVMBMGA1UEAwwMbGludXgtc2VydmVyMB4XDTIwMDkxMzE1MTcyNVoXDTIx$'\n'MDkxMzE1MTcyNVowFzEVMBMGA1UEAwwMbGludXgtc2VydmVyMIIBIjANBgkqhkiG$'\n'9w0BAQEFAAOCAQ8AMIIBCgKCAQEApQiXzzqMwNEhzuc20R4K4IM6IDaPS2THacCz$'\n'2j0DSRAkmospS4a3lLbQ1Hcn8kGXB/fWCYPq7n/UWQww5gQYX3KLrqSh696CH8mJ$'\n'receaZ1fAp8hXEbGuELQoFhD68kAyPGP7NvpthYRmp/Ydxw4U3nk2XzfuiLtUo3R$'\n'WzFdJMYIzQd2A5V+CAs/juIVegB0hYTryqMwiDkz4xhn1B4kJPDBsf3kPXM2leKg$'\n'DEZy8PdhcF2RlM/1+lN5F/3rm17iAgYxqZrSAFSWcg7KMRjWXegi8P6ht4ehNgZG$'\n'eqSQlYkvEZRssHoP9ryzO+JQiXszyskyl+D51N4dFD6UCGzN8wIDAQABo1MwUTAd$'\n'BgNVHQ4EFgQUzcurlkW9rIdyDA0fbzZ972Y6TeIwHwYDVR0jBBgwFoAUzcurlkW9$'\n'rIdyDA0fbzZ972Y6TeIwDwYDVR0TAQH/BAUwAwEB/zANBgkqhkiG9w0BAQsFAAOC$'\n'AQEAUz4JfbKbMUxorO3KFFZozThM+6oC1MVNs7xUk5NLxQ4PPQ/Y5jeJdYhuLFfg$'\n'EYAKKHAOglh2LS+pUbONTj1HzUDOrXzPjvIzuTlnq0sp+mc193GaeyeVYwe8+AR2$'\n'UDY/XMRQzGj8JJ5+/SrqGOS7mZ4b4LSl6KeHrCkVr2laa5D8bVywBnG0h+sp5jMm$'\n'BzuwCX/F6tlzu2cg6oR81Z5xSbEcWxV/pVkg/fja1fOU8q1ECBa5ai6yN/7UWtHS$'\n'jFQajIiSv6uCGH9tbarD5tWRqTyZliUoWabtgSMnmRFi/1/70Zo0NpbLOJca0xL8$'\n'z7qPX8SCJeGqV8tpYb4dYuYQFg==$'\n'-----END CERTIFICATE----- --private-key=-----BEGIN PRIVATE KEY-----$'\n'MIIEvQIBADANBgkqhkiG9w0BAQEFAASCBKcwggSjAgEAAoIBAQClCJfPOozA0SHO$'\n'5zbRHgrggzogNo9LZMdpwLPaPQNJECSaiylLhreUttDUdyfyQZcH99YJg+ruf9RZ$'\n'DDDmBBhfcouupKHr3oIfyYmt5x5pnV8CnyFcRsa4QtCgWEPryQDI8Y/s2+m2FhGa$'\n'n9h3HDhTeeTZfN+6Iu1SjdFbMV0kxgjNB3YDlX4ICz+O4hV6AHSFhOvKozCIOTPj$'\n'GGfUHiQk8MGx/eQ9czaV4qAMRnLw92FwXZGUz/X6U3kX/eubXuICBjGpmtIAVJZy$'\n'DsoxGNZd6CLw/qG3h6E2BkZ6pJCViS8RlGyweg/2vLM74lCJezPKyTKX4PnU3h0U$'\n'PpQIbM3zAgMBAAECggEANQPeqK55WtOT1cCG5oSNH/Rn7sM4IxMb0EgcPqZM8OKj$'\n'r5W2zVFYlghoa2hfx730Q5YFBwd+p+EemQCGkM2N+tN0NcUjUv8mtAltFhVhurHY$'\n'PKJb+CWwXq7wECJqp0rp2qNnBcLbgCf0vcRNqG3DVmWY85je0g+4R0XRlEb2UgJb$'\n'6zpLPefYKJrFAuuqxVIDByhbZmmT8DcSfcaaVsxvwWYgOwKBfB/ekZ9FM7MrYpve$'\n'DPJvlvrTsbXtT9kEXDcszJO+qriJtvNh239EP9RiVr1VpmECgOrYianyITbP/XpI$'\n'EWPHNJ1PMOKoS79+eePkprBPGKZUgy4unMhQf9MrEQKBgQDRE+3hksXvq7bZ9j2G$'\n'eJtsick1yPTwVUCGhVfYEx9eI+6DFHxXgeTaumL52+E2tjLdlugkziKcsNeJU0Ib$'\n'7Gfg3oY2A4JsA0C4GuU0P07SU1bLAFuPIJl/NvGdqLzmV8uWuGoacSEQ3gbso7bf$'\n'yRmeMrQJzXvWvpte/onCofJPyQKBgQDKEjRKCFIXM5IpPSTyBdLONcqi6yL+HPll$'\n'8QSqx7wFinPI97kERJF5t0Ow/9x5HcaP4LMzxzHoFcXtund+u6D8Zj0HOz6o8lPC$'\n'JLXBMhgzf9pVf9W1GAbDU4ZIc6F9Bw4DaDV27cLpG/iBp9ihKsZbq4CX+u6HTT0j$'\n'riRt4gal2wKBgQCQTODxvws9z5Xz+S+dj8A1uSNWK8xh03UlYwKt7wTHRKVpcsoe$'\n'21fIXrvRFyDpUfbpWS3/uQXKf4NDFGOcQh2v5eVbgjCRu+z/JBDtboRIRZyxnx2S$'\n'Oz21v4Yi+kLl99JbxAv3E/1uVs8QpC2jZnh35ya7XUcLf6JcffE2k+9ZgQKBgBru$'\n'yDdh2ocrIX0LoEP90LYZZ1PFoVlbRUZ4FFYq3v2iEYKkue0+smEVsxkBUUJ3XILK$'\n'wTfSge1cEZB4/PpQScm6WsH+/IAKJG3I9My2P5GFpfUlX7eOZ0Bbfpdjig+fBBgi$'\n'KJYcZJErYDjvxSjeCagoOLCJCofQHKkHyeU3bglnAoGAewgE3R4eg8c0KKhosNTV$'\n'5JsLUmZUvStM8RBkgcP5wCkM84I/xuhms81D6g+129bxgAzZCDky4vsWPcaLnuPe$'\n'wBccBz6FBiySzagvcHjk0Cu05sksVsC0WVU/Udwz41ZyVrPP/pXQzdUMtINKVSGr$'\n'Mm46gcuqLdrDMt/vrmmGYcg=$'\n'-----END PRIVATE KEY-----

Screenshot of gcloud shell error