javascript – How to turn nodeJS scraper results into a CSV file?

I'm using the Node.js scraper below, which scrapes the Google Play store. However, I'd like the results to end up in a CSV file; currently they just print to the console:

https://github.com/facundoolano/google-play-scraper

Secondly, I would like to somehow combine the "search" and "app detail" functions, so that I could search for a term and get back each app's title, developer name, app URL, and developer email.

Search function:

var gplay = require('google-play-scraper');

gplay.search({
    term: "panda",
    num: 2
  }).then(console.log, console.log);

App Details Function:

var gplay = require('google-play-scraper');

gplay.app({appId: 'com.google.android.apps.translate'})
  .then(console.log, console.log);

Design relation for relational database (to import from below csv table) to query using REST API

I have this table where I am trying to model the relationship between exercises and muscle groups, so that I can find the list of exercises for a muscle group and also request the list of muscle groups for an exercise. However, I am having trouble deciding what sort of keys I should use. I was thinking of a nested approach, but that might be overkill. Any suggestions/thoughts?


I wanted to create a separate boolean/junction table with YES/NO values, but I'm not sure how to leverage this.

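This is the classic many-to-many case, which a junction table handles in both query directions without any YES/NO columns: a row's presence in the junction table *is* the YES. A minimal sketch in Python with sqlite3 (table and column names are illustrative, not taken from the screenshots):

```python
import sqlite3

# Each entity gets a surrogate primary key; the junction table holds one
# row per (exercise, muscle group) pairing, with a composite primary key
# to forbid duplicates.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE exercise (
        exercise_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL UNIQUE
    );
    CREATE TABLE muscle_group (
        muscle_group_id INTEGER PRIMARY KEY,
        name            TEXT NOT NULL UNIQUE
    );
    CREATE TABLE exercise_muscle_group (
        exercise_id     INTEGER REFERENCES exercise(exercise_id),
        muscle_group_id INTEGER REFERENCES muscle_group(muscle_group_id),
        PRIMARY KEY (exercise_id, muscle_group_id)
    );
""")
con.executemany("INSERT INTO exercise(name) VALUES (?)",
                [("Squat",), ("Bench Press",)])
con.executemany("INSERT INTO muscle_group(name) VALUES (?)",
                [("Quads",), ("Chest",), ("Triceps",)])
# Squat -> Quads; Bench Press -> Chest, Triceps
con.executemany("INSERT INTO exercise_muscle_group VALUES (?, ?)",
                [(1, 1), (2, 2), (2, 3)])

def exercises_for(group_name):
    # One direction of the lookup; the reverse query is the mirror image.
    rows = con.execute("""
        SELECT e.name FROM exercise e
        JOIN exercise_muscle_group j USING (exercise_id)
        JOIN muscle_group m USING (muscle_group_id)
        WHERE m.name = ?""", (group_name,)).fetchall()
    return [r[0] for r in rows]
```

Both directions are a single join, which maps cleanly to REST endpoints such as `/muscle-groups/{id}/exercises` and `/exercises/{id}/muscle-groups`.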

exchange rate – Is there a webpage that provides a JSON or CSV or VCALENDAR file for Bitcoin price predictions?

Various people have made various Bitcoin price predictions, and continue to make them. There are webpages which talk about them in English text, but where is one which has it in a computer-parseable format?

I want such a thing so that I can automatically insert these predictions into my calendar and into my "Bitcoin overview" dashboard, a big table of Bitcoin prices that I stare at every day.

That is, something like:

2021-03-04,Citibank predicts $100,000
2021-05-04,Elon Musk predicts $120,000
2021-08-11,Kevin O'Leary predicts $380,000
2022-01-01,Goldman Sachs predicts $18,000,000

Does such a thing exist? It would be very nice to always be able to "look ahead" at what various people/entities have predicted, with some kind of community effort or a dedicated person keeping it up to date as new predictions come in. I'm not going to sit and read web articles talking about predictions, but if I had the data in a format like this, or something similar, I would have a chance to follow these predictions, look forward, and hope for them to come true.
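Whether or not such a feed exists, if predictions were kept in the simple `date,description` form shown above, turning them into VCALENDAR entries would only take a few lines. A hedged Python sketch (note the split on the *first* comma only, since the dollar amounts themselves contain commas):

```python
from datetime import datetime

def predictions_to_vcalendar(lines):
    # Each input line is "YYYY-MM-DD,free-text description".
    events = []
    for line in lines:
        date_str, description = line.split(",", 1)  # first comma only
        stamp = datetime.strptime(date_str, "%Y-%m-%d").strftime("%Y%m%d")
        events.append(
            "BEGIN:VEVENT\n"
            f"DTSTART;VALUE=DATE:{stamp}\n"
            f"SUMMARY:{description}\n"
            "END:VEVENT"
        )
    return ("BEGIN:VCALENDAR\nVERSION:2.0\n"
            + "\n".join(events)
            + "\nEND:VCALENDAR")

sample = ["2021-03-04,Citibank predicts $100,000"]
print(predictions_to_vcalendar(sample))
```

The resulting text can be saved as an `.ics` file and imported into most calendar applications.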

magento2 – How can I import bulk products in PHP from a CSV with 1 lakh (100,000) rows

I tried the code below, but it saved only 10k records. How can I import with a batch size?

if (isset($_POST["Import"])) {

    $filename = $_FILES["file"]["tmp_name"];
    if ($_FILES["file"]["size"] > 0) {
        $file = fopen($filename, "r");
        while (($getData = fgetcsv($file, 100000, ",")) !== FALSE) {
            // NOTE: real code should use prepared statements instead of
            // concatenating values, to avoid SQL injection.
            $sql = "INSERT INTO employeeinfo (emp_id, firstname, lastname, email, reg_date)
                    VALUES ('" . $getData[0] . "','" . $getData[1] . "','" . $getData[2] . "','" . $getData[3] . "','" . $getData[4] . "')";
            $result = mysqli_query($con, $sql);
            if (!$result) {
                echo "<script type=\"text/javascript\">
                        alert(\"Invalid File: Please Upload CSV File.\");
                        window.location = \"index.php\";
                      </script>";
            }
        }

        fclose($file);

        // Report success once, after the whole file has been processed,
        // rather than once per row.
        echo "<script type=\"text/javascript\">
                alert(\"CSV File has been successfully Imported.\");
                window.location = \"index.php\";
              </script>";
    }
}
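The batching idea itself is language-agnostic: accumulate rows and flush one multi-row insert per batch instead of issuing one query per CSV line. A sketch in Python with sqlite3 for brevity (the same pattern applies with mysqli/PDO in PHP; the table follows the snippet above, and `BATCH_SIZE` is an illustrative tuning knob):

```python
import csv
import io
import sqlite3

BATCH_SIZE = 1000  # rows per flush; tune for your server

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE employeeinfo
               (emp_id, firstname, lastname, email, reg_date)""")

def import_csv(fileobj, batch_size=BATCH_SIZE):
    """Insert CSV rows in batches; returns the number of rows imported."""
    reader = csv.reader(fileobj)
    batch, total = [], 0
    for row in reader:
        batch.append(row[:5])  # assumes each line has the 5 expected fields
        if len(batch) >= batch_size:
            con.executemany(
                "INSERT INTO employeeinfo VALUES (?,?,?,?,?)", batch)
            total += len(batch)
            batch = []
    if batch:  # flush the final partial batch
        con.executemany("INSERT INTO employeeinfo VALUES (?,?,?,?,?)", batch)
        total += len(batch)
    con.commit()
    return total

data = "\n".join(f"{i},First{i},Last{i},e{i}@x.com,2021-01-01"
                 for i in range(2500))
print(import_csv(io.StringIO(data)))
```

Wrapping each batch in a single transaction (rather than autocommitting every row) is usually where most of the speedup comes from.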

c# – Import Xml and Csv files into database

I have a simple service class here which imports CSV or XML files into a database, using a .NET Standard library in C#.

Do you have any comments? Are there any recommended techniques to use instead of the switch statement in the method ProcessPaymentFile?

public class AbcPaymentService : IAbcPaymentService
{
    private readonly IAbcPaymentContext _abcPaymentContext;
    private readonly IConfiguration _configuration;
    public AbcPaymentService(IAbcPaymentContext abcPaymentContext, IConfiguration configuration)
    {
        _abcPaymentContext = abcPaymentContext;
        _configuration = configuration;
    }

    public List<PaymentTransactionDetailResponse> GetTransactionsByCurrency(string currency)
    {
        var paymentTransactions = _abcPaymentContext.PaymentTransactions.Where(p => p.CurrencyCode == currency).ToList();

        return MapPaymentTransactions(paymentTransactions);
    }

    public List<PaymentTransactionDetailResponse> GetTransactionsByDateRange(DateTime dateFrom, DateTime dateTo)
    {            
        var paymentTransactions = _abcPaymentContext.PaymentTransactions
            .Where(p => p.TransactionDate >= dateFrom && p.TransactionDate <= dateTo).ToList();

        return MapPaymentTransactions(paymentTransactions);
    }

    public List<PaymentTransactionDetailResponse> GetTransactionsByStatus(string status)
    {
        // add more validation. ie. check length. 

        var paymentTransactions = _abcPaymentContext.PaymentTransactions.Where(p => p.Status == status).ToList();

        return MapPaymentTransactions(paymentTransactions);
    }

    public void ProcessPaymentFile(IFormFile file)
    {
        #region Validation
        var fileExtension = Path.GetExtension(file.FileName);

        var validFileTypes = new List<string> { ".csv", ".xml" }; // move to appsetting for easier configuration.            
        bool isValidType = validFileTypes.Any(t => t.Trim() == fileExtension.ToLower());

        if (isValidType == false)
            throw new ArgumentException($"Unknown format.");

        if (file.Length > 1024 * 1024) // 1 MB; move to appsetting for easier configuration.
            throw new ArgumentException($"Invalid file size. Only less than 1 MB is allowed.");
        #endregion

        // Upload file to server
        var target = _configuration["UploadPath"];

        var filePath = Path.Combine(target, file.FileName);

        try
        {
            using (var stream = new FileStream(filePath, FileMode.Create))
            {
                file.CopyTo(stream);
            }

            switch (fileExtension.ToLower())
            {
                case ".csv":
                    var config = new CsvConfiguration(CultureInfo.InvariantCulture)
                    {
                        HasHeaderRecord = false,                            
                    };
                    using (var reader = new StreamReader(filePath)) {
                        using (var csv = new CsvReader(reader, config))
                        {
                            csv.Context.RegisterClassMap<CsvMap>();
                            var paymentTransactionCsv = csv.GetRecords<Models.Xml.Transaction>().ToList();

                            SaveToDb(paymentTransactionCsv);
                        }
                    }                        
                    break;
                case ".xml":
                    var serializer = new XmlSerializer(typeof(Models.Xml.Transactions));

                    using (TextReader reader = new StreamReader(new FileStream(filePath, FileMode.Open)))
                    {
                        var paymentTransactionXml = (Models.Xml.Transactions)serializer.Deserialize(reader);

                        SaveToDb(paymentTransactionXml.Transaction);
                    }
                    break;
                default:
                    throw new ArgumentException($"Invalid file type. Only {string.Join(",", validFileTypes)} allowed.");
            }
        }
        catch (Exception ex)
        {
            throw new Exception(ex.Message);
        }            
    }

    #region PrivateFunctions
    private void SaveToDb(List<Models.Xml.Transaction> paymentTransactions)
    {
        if (PaymentTransactionXmlIsValid(paymentTransactions) == false)
            throw new Exception("Invalid transaction."); // todo: write into log file or db

        // if all validation passed, map objects and 
        var paymentTransactionsEntity = paymentTransactions.Select(p => new PaymentTransaction()
        {
            TransactionId = p.Id,
            TransactionDate = p.TransactionDate,
            Amount = Convert.ToDecimal(p.PaymentDetails.Amount),
            CurrencyCode = p.PaymentDetails.CurrencyCode,
            Status = Mapper.MapStatus(p.Status)
        })
        .ToList();

        // save into db.
        _abcPaymentContext.PaymentTransactions.AddRange(paymentTransactionsEntity);
        _abcPaymentContext.SaveChanges();

        // todo: don't insert duplicate transaction
    }

    private bool PaymentTransactionXmlIsValid(List<Models.Xml.Transaction> paymentTransactions)
    {
        foreach (var trans in paymentTransactions)
        {
            if (string.IsNullOrEmpty(trans.Id)) return false;
            if (trans.TransactionDate == null) return false;
            if(trans.PaymentDetails.Amount == 0) return false;
            if (string.IsNullOrEmpty(trans.PaymentDetails.CurrencyCode)) return false;
            if (string.IsNullOrEmpty(trans.Status)) return false;             
        }
        return true;
    }

    private List<PaymentTransactionDetailResponse> MapPaymentTransactions(List<PaymentTransaction> paymentTransactions) {
        // Construct Dto responses model.
        var paymentTransactionDetailResponses = paymentTransactions.Select(p => new PaymentTransactionDetailResponse()
        {
            Id = p.TransactionId.ToString(),
            Payment = $"{p.Amount} {p.CurrencyCode}",
            Status = p.Status
        }).ToList();

        return paymentTransactionDetailResponses;
    }
    #endregion
}
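Regarding the switch in ProcessPaymentFile: one common alternative is a table of handlers keyed by extension, so supporting a new format means registering one entry instead of editing a switch. A minimal, language-agnostic sketch of the idea (shown in Python with hypothetical parser functions, not the service's real API):

```python
# Strategy-table sketch: map each extension to a parser callable.
def parse_csv(path):
    # hypothetical stand-in for the CsvReader branch
    return f"csv:{path}"

def parse_xml(path):
    # hypothetical stand-in for the XmlSerializer branch
    return f"xml:{path}"

PARSERS = {".csv": parse_csv, ".xml": parse_xml}

def process_payment_file(path):
    ext = path[path.rfind("."):].lower()
    try:
        parser = PARSERS[ext]
    except KeyError:
        raise ValueError(
            f"Invalid file type. Only {', '.join(PARSERS)} allowed.")
    return parser(path)
```

In C# the equivalent would be a `Dictionary<string, Action<string>>` (or an injected `IEnumerable<IPaymentFileParser>` resolved by extension), which also lets the validation reuse the same table instead of the separate `validFileTypes` list.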

pandas – How can I return row for cell if the cell data matches another cell from a different CSV file? Python

I want to compare perceptual hash values that are stored in CSV files: a CSV for each type of modification performed on the image and a CSV for the originals.
How can I extract the hash values from the CSV file when the filename from the Original matches the filename from the modified images CSV?

My goal is, for each image in the Original CSV dataset, to compare each of its hash values against the hash values in each of the other CSV files, but only when the filenames match. So far I have the filenames matching, but I also want to access the data stored in the other columns.

    import os
    import pandas as pd

    originalImages_path = 'FullPath\\Python Project\\Images\\Original'
    path = 'FullPath\\Python Project\\Images\\'
    fields = ['Filename', 'aHash', 'pHash', 'wHash', 'DHash', 'ColorHash']
    originalImages_CSV = pd.read_csv(os.path.join(originalImages_path, 'Originalimagehashes.csv'), usecols=fields)


    for AHash_Original in originalImages_CSV.Filename:
        AHash_OriginalStrip = AHash_Original.replace('Original', '').split('.')

        for folder in os.listdir(path):
            if folder != 'Original':

                for file in os.listdir(os.path.join(path, folder)):

                    if file.endswith('.csv'):
                        modifiedImagesCSV = pd.read_csv(os.path.join(path, folder, file), usecols=fields)

                        for modifiedImage in modifiedImagesCSV.Filename:
                            modifiedImageName = modifiedImage.split('_')

                            if AHash_OriginalStrip[0] == modifiedImageName[0]:
                                print(f'MATCH! {AHash_Original} and {modifiedImage}')

The above code shows that I can successfully match each original image to its modified version in each of the modified CSVs. I am unsure where to go from here and have been struggling with the logic behind it. The code above only handles aHash; there are four other hash algorithms I have stored and would like to compare.
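One way forward is to let pandas do the matching instead of nested loops: derive a shared key column in both frames and merge on it, which lines up every hash column of both files at once. A hedged sketch with illustrative filenames and hash values (the key-derivation mirrors the `replace`/`split` logic above):

```python
import pandas as pd

# Stand-ins for the two CSVs; in practice these come from pd.read_csv.
originals = pd.DataFrame({
    'Filename': ['catOriginal.png', 'dogOriginal.png'],
    'aHash': ['aaa1', 'bbb2'],
    'pHash': ['ppp1', 'qqq2'],
})
modified = pd.DataFrame({
    'Filename': ['cat_blur.png', 'dog_blur.png'],
    'aHash': ['aaa9', 'bbb2'],
    'pHash': ['ppp1', 'qqq8'],
})

# Derive the same base-name key that the loop version compares.
originals['key'] = (originals.Filename.str.replace('Original', '')
                    .str.split('.').str[0])
modified['key'] = modified.Filename.str.split('_').str[0]

# Inner merge keeps only matching filenames; suffixes separate the
# identically named hash columns from the two files.
merged = originals.merge(modified, on='key', suffixes=('_orig', '_mod'))

# Every hash comparison is now a plain column comparison.
merged['aHash_match'] = merged.aHash_orig == merged.aHash_mod
print(merged[['key', 'aHash_match']])
```

The same pattern extends to the other four hashes (`pHash_orig == pHash_mod`, and so on), and the merge can be repeated once per modification folder.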

8 – General error: 2 File ‘/code/sites/default/files/myfile/myfile.csv’ not found (Errcode: 2) insert csv record in DB

I am getting an error while inserting CSV records in the database. It was working fine before, but now it doesn’t. I am using MySQL 10.0.23-MariaDB-log.

Drupal\Core\Database\DatabaseExceptionWrapper: SQLSTATE[HY000]: General error: 2 File '/code/sites/default/files/myfile/myfile.csv' not found (Errcode: 2): LOAD DATA LOCAL INFILE '/code/sites/default/files/myfile/myfile.csv' INTO TABLE pantheon.tablename FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n' IGNORE 1 ROWS; Array ( ) in Drupal\modulename\Form\myUploadForm->submitForm() (line 56 of /code/modules/custom/modulename/src/Form/myUploadForm.php).

This the code causing the error.

$database = \Drupal::database();
$query = $database->query("TRUNCATE TABLE tablename");
$query = $database->query("LOAD DATA LOCAL INFILE '$file_server_path'
  INTO TABLE `pantheon`.`tablename`
  FIELDS TERMINATED BY ','
  ENCLOSED BY '\"'
  LINES TERMINATED BY '\n'
  IGNORE 1 ROWS");

sql server – File is getting generated into folder, But not a csv file

With the stored procedure below, a normal text file is generated instead of a CSV file.

ALTER PROCEDURE [dbo].[GenerateCDR_Test]
AS
BEGIN
DECLARE @FileName varchar(50),
        @bcpCommand varchar(2000)

SET @FileName = 'C:\TransactionsData\' + ('Tansactions_' + (CONVERT(VARCHAR, GETDATE(), 112) + '_' + CAST(DATEPART(HOUR, GETDATE()) AS VARCHAR) + '_' + CAST(DATEPART(MINUTE, GETDATE()) AS VARCHAR) + '_' + CAST(DATEPART(SECOND, GETDATE()) AS VARCHAR)) + '.csv');

SET @bcpCommand = 'BCP ' + '"SELECT * FROM [LinkVisaCard].[dbo].[CDRGeneration]"' + ' queryout '

SET @bcpCommand = @bcpCommand + @FileName + ' -U abc -P password -w'

EXEC master..xp_cmdshell @bcpCommand
END
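A likely cause: bcp's -w flag produces Unicode output whose fields are tab-delimited by default, so the file is really a tab-separated text file even though it is named .csv. Passing -t, sets the field terminator to a comma. A sketch of the command string the procedure would then build (server, credentials, and path are the question's own placeholders):

```shell
# bcp defaults to tab field terminators; -t, makes the output
# genuinely comma-separated so the .csv extension matches the content.
BCP_CMD='BCP "SELECT * FROM [LinkVisaCard].[dbo].[CDRGeneration]" queryout C:\TransactionsData\out.csv -U abc -P password -w -t,'
echo "$BCP_CMD"
```

In the procedure this means appending ` -t,` when building `@bcpCommand`. Note also that bcp emits no header row; if a header is needed, it has to be added separately.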

import – Formatting Imported Complex Arrays from Python (csv. files)

Just a short question: I am trying to import a complex array (matrix) from Python into Mathematica by first writing it to a CSV file. When I Import the CSV into Mathematica, the matrix elements are wrapped in round brackets with the imaginary part denoted by Python's imaginary-unit symbol 'j', a form Mathematica does not understand, as the code example (a 2x2 matrix) below shows.
The matrix $A$ that I am importing as an example is:
\begin{pmatrix}
i & 2 \\
3 & 4
\end{pmatrix}

A = Import["C:\\Users\\JohnDoe\\Documents\\PycharmProjects\\pythonProject\\foo.csv", "Data"]

where the output of imported array A in Mathematica is:

{{(0.00000000000000000e+00+1.00000000000000000e+00j),(2.00000000000000000e+00+0.00000000000000000e+00j)},
{(3.00000000000000000e+00+0.00000000000000000e+00j), (4.00000000000000000e+00+0.00000000000000000e+00j)}}

Can anyone advise how to process the array after importing so that it is in standard Mathematica form, without the brackets and with Mathematica's standard imaginary unit?

Thanks for any assistance.
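One Python-side option, sketched below, is to write each complex entry as a Mathematica-readable expression ("a + b*I") instead of Python's "(a+bj)" repr; after Import, applying ToExpression at level 2 (e.g. `Map[ToExpression, A, {2}]`) should then yield ordinary Mathematica complex numbers. The helper name is illustrative:

```python
import csv
import io

def to_mathematica(z):
    # Render a Python complex as "re + im*I", which ToExpression parses.
    return f"{z.real:.17g} + {z.imag:.17g}*I"

A = [[1j, 2], [3, 4]]

buf = io.StringIO()  # stand-in for open("foo.csv", "w", newline="")
writer = csv.writer(buf)
for row in A:
    writer.writerow([to_mathematica(complex(z)) for z in row])
print(buf.getvalue())
```

The `.17g` format keeps full double precision, so no accuracy is lost compared with the exponential notation in the original file.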