ssis – Install Oracle 32 or 64 bit on Windows Server 2019 to import data into SQL Server and use with Power BI

Good afternoon everyone

Posting for the first time, so excuse me if I make mistakes.

I'm a newbie to Oracle and I'm really struggling with this whole 32-bit vs. 64-bit Oracle business.

All I really want to do is import data from an Oracle 12c DW into a local 2019 SQL Server daily and then connect Power BI over it.

The reason for this is that we are trying to switch from Oracle to SQL Server and I have to build a PoC.

So far I've looked up and down and can't figure out WHAT to install to access WHAT in Oracle.

All I really need, if I understand it correctly, is to install an Oracle CLIENT so that I can use SSIS to create my ETL routines. BUT when I install a 32-bit client it says that a 64-bit client is needed, and when I install a 64-bit client it says that ODAC is needed. I installed that too and still have no success.

I also have trouble understanding which folder level goes into the PATH in the system environment variables.
It's so complicated compared to SQL Server, so I could really use your help with the following questions.

  1. Given that SSMS is 32-bit: can I install only the 32-bit Oracle client and ODAC? If so, what software do I download and install from Oracle? I can't tell the difference between the xcopy and other versions. (And can you tell me how the PATH gets updated?)

  2. Do I have to install the x64 version at all, since I am on a 64-bit workstation? If so, what software do I download?

My ultimate goal is to be able to
– Connect to Oracle from SQL Server SSIS and import data daily via ETL routines
– As a temporary fallback, also connect to Oracle directly from Power BI.

Any help and advice much appreciated.

Many thanks

ssis – Problem with SQL principals in the report server and in the company web portal

I've had problems with the report server for the past few days. I created a package and deployed it to the SQL Server, then created a job for it and called that job from the SSRS report. The report works fine when run normally, but after it is deployed to the report server and the corporate web portal, the following error appears:

An error occurred during report processing. (rsProcessingAborted)
The query for the dataset 'spjob' failed. (rsErrorExecutingCommand)
Cannot execute as the server principal because the principal "sa2" does not exist, this type of principal cannot be impersonated, or you do not have permission.

I know this is a permissions error, but I checked everything on the SQL Server and found no solution. Please help with this problem.

SQL Server – The SSIS Script transformation package is not compiled when deployed to dev [Visual Studio 2017]

I have Visual Studio 2017.

I have a script transformation that I can easily run locally. However, when I deploy and run it in the Integration Services catalog on the development server, it fails with the following error:

'Package Name': Error: Error compiling the script in the
package. Open the package in SSIS Designer and resolve the
following compilation errors.

But if I remote into the development server and rebuild the package there, it works fine the next time I run it.

I thought this could be a reference problem, but after reinstalling Visual Studio 2017 and SSDT, the references on both computers are exactly the same.

What could be causing this?

ssis – Replace a character in a Derived Column

So I get CSV files, and for some reason the values come out as "123-45-9873", with a literal double quote in the data.

Now I'm cutting out the "-", but the column is an SSN column, so the result becomes "12345987. The leading " quote pushes the final 3 out of the nine-character field and ends up in the table.

Which replace expression can I use to remove that character as well?

For reference, this is what I'm doing

(DT_STR,9,1252)REPLACE(SSN,"-","")

and get "12345987
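If the stray character really is a double quote in the data, one fix is to nest a second REPLACE around the first. Here is a minimal Python sketch of the same logic (the function name and sample value are mine, not from the post):

```python
def clean_ssn(raw: str) -> str:
    # Strip hyphens first, then any stray double quotes; this is the
    # same effect as nesting SSIS REPLACE calls:
    #   REPLACE(REPLACE(SSN, "-", ""), "\"", "")
    return raw.replace("-", "").replace('"', "")

print(clean_ssn('"123-45-9873'))  # 123459873
```

With the quote removed before the (DT_STR,9,1252) cast, all nine digits fit and nothing gets truncated.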

SSIS error – [Execute SQL Task] Error: The result binding name must be set to zero for full result set and XML results

I have an Execute SQL Task whose ResultSet is set to "Full result set". Nevertheless, the task fails. How can I fix this?

[Execute SQL Task] Error: The result binding name must be set to zero for full result set and XML results.

SQL Server – SSIS Package Deployment. Changing the protection level takes a long time

We are currently migrating our servers to a new data center. On the current SQL Server 2014, SSIS packages deploy in seconds.

The new data center has several SQL Server 2016 installations, and I've found that "changing the protection level" takes a long time on some of them (about 7 minutes!), but is fast on others. If I change the SSIS project's TargetServerVersion to 2016, changing the protection level is slow on all of the new servers.

The protection level of the project and the packages has been (and is) set to DontSaveSensitive.

What is the cause of the slowness and how can I fix it? The deployment does eventually finish without errors.

SQL Server – SQL Agent job fails for SSIS execution: "An invalid report option was specified, only E, W, I, C, D, P, V, and N are allowed."

This error can occur if package parameters are specified on the Configuration tab of the job step and one of the parameters (for example, a password) contains a character that needs to be escaped, e.g. a single quote. When parameters are changed in the Job Step dialog, they are appended to the job step's command line, and a single or double quote is not escaped. This results in an invalid (or worse, incorrect) command line.

Possible solutions are:

  • Delete the job step, create a new job step, and specify the package parameters directly on the SSIS project (in SSMS under Integration Services Catalogs, SSISDB, your folder, Projects). The job step's command line then does not try to pass the parameters; the package execution engine reads them directly and handles special characters better.

  • It is possible (but I have not tested it) to insert a backslash escape before the special character. For a password of My'Pa$$word, you would use My\'Pa$$word instead.

  • Remove the offending special character from the parameter (for example, if it is a password, generate a new password without quotation marks).

I do not know all the characters that can cause this problem, but I know that single quotes do, and I think it is safe to assume double quotes do as well.

You can see which SQL Agent command is being executed by running the following query:

USE msdb;
SELECT t1.command
FROM sysjobsteps t1
JOIN sysjobs t2 ON t1.job_id = t2.job_id
WHERE t2.name = '';  -- your job name here

The command contains a few standard /Par entries for $ServerOption::LOGGING_LEVEL and $ServerOption::SYNCHRONIZED. However, if parameters have been changed in the job step, they will also appear on the command line, e.g.

/Par ""SQLServer_Password"";""PLAIN_TEXT_PASSWORD""

Note that for password fields, the password is stored in sysjobsteps in plain text format, so you definitely do not want to change this in the job step.

There may be other issues that lead to invalid command errors; feel free to edit this answer to add more.
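The failure mode is easy to reproduce outside SQL Agent. The sketch below uses Python's shlex, which follows POSIX quoting rules rather than dtexec's, but it shows the same thing: a parameter value with an embedded, unescaped quote breaks tokenization of the command line (the sample values are invented):

```python
import shlex

# A well-formed /Par argument parses as expected:
clean = '/Par "Password";"Secret123"'
print(shlex.split(clean))  # ['/Par', 'Password;Secret123']

# An unescaped quote inside the value leaves a quote unbalanced,
# so the parser cannot even tokenize the line:
broken = '/Par "Password";"My"Pa$$word"'
try:
    shlex.split(broken)
except ValueError as e:
    print("parse error:", e)
```

The same class of breakage is what turns the dtexec arguments into an "invalid report option": a quote inside the value shifts everything after it into the wrong token.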

SSIS Environment Variables!

I have an SSIS project with two connection managers (source and destination). I am trying to use environment variables to deploy the project to development and production. However, in production it still runs with the development configuration: changing the connection string in the production environment variable has no effect, and the package keeps using the development connections.
Please suggest suitable articles or videos.

SQL Server – Using SSIS transactions to roll back a package

I'm at a loss with an SSIS package I inherited. It contains:

  • 1 Script Task
  • 3 Execute SQL Tasks
  • 5 Data Flow Tasks (each containing a series of merges, lookups, inserts, and other transformations)
  • 1 File System Task

All of this is encapsulated in a Foreach Loop container. I was asked to change the package so that the entire load is rolled back if there is an error in any of the steps in the control/data flow. I have tried two different approaches to achieve this:

I. Using Distributed Transactions.

I made sure that:

  • MSDTC was running on both the destination server and the client (screenshot attached).
  • Msdtc.exe was added as a firewall exception on both server and client.
  • Inbound and outbound rules were set on server and client to allow DTC connections.
  • The Foreach Loop container's TransactionOption is Required; all other tasks are Supported.
  • In my OLE DB connection, RetainSameConnection is set to TRUE, and I use SQL Server authentication with the Save password check box selected.

When I run the package, it fails immediately after the Script Task (the first step). After trying for a week to find a workaround, I decided to try simulating the transaction myself with three Execute SQL Tasks:

  • BEGIN TRAN before the Foreach Loop container
  • COMMIT TRAN after the Foreach Loop container, on a success constraint
  • ROLLBACK TRAN after the Foreach Loop container, on a failure constraint

In this case, the TransactionOption property is set to Supported for the Foreach Loop container and all other tasks. The problem here is that the package runs until the fourth Data Flow Task and then hangs forever. After logging in to SQL Server and checking the running sessions, I noticed sys.sp_describe_first_result_set;1 as the head blocker session.

In some research, I found that this may be related to some TRUNCATE statements in some of my Data Flow Tasks, which can take a schema lock. I changed the ValidateExternalMetadata property to False for all the tasks in my data flows and changed my TRUNCATE statements to DELETE statements instead. I reran the package and it still hangs in the same place with the same head blocker.

As an alternative, I tried creating a second OLE DB connection to the same database, assigning this new connection to my BEGIN, ROLLBACK, and COMMIT SQL tasks with RetainSameConnection set to TRUE, and setting RetainSameConnection to FALSE (and also TRUE) on the original OLE DB connection used by the data flow tasks. This worked in the sense that the package appeared to run: it completed, and the COMMIT TRAN ran properly. I then re-executed it with a forced error so that it failed, and the ROLLBACK TRAN task ran successfully. But when I queried the affected tables, the transaction had not been rolled back: all new records were inserted and old ones were updated (the BEGIN TRAN was clearly issued on a different connection and therefore had no effect on the package's work). I'm not sure what else to try. Any help would be really appreciated. I'm at my wit's end!

P.S. DelayValidation is set to True for all objects, and the SQL Server version is 2012.
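For what it's worth, the all-or-nothing behavior being asked for, one transaction wrapped around the whole loop and rolled back if any iteration fails, can be sketched outside SSIS. This is not SSIS code, just an illustration of the intended semantics using Python's built-in sqlite3 (table and data are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")

def load(rows):
    # One transaction around the entire loop: commit only if every
    # insert succeeds, otherwise roll back all of them. This is the
    # same intent as the BEGIN TRAN / COMMIT / ROLLBACK Execute SQL
    # Tasks placed around the Foreach Loop container.
    try:
        with con:  # commits on success, rolls back on exception
            for row in rows:
                con.execute("INSERT INTO target VALUES (?, ?)", row)
        return True
    except sqlite3.Error:
        return False

ok = load([(1, "a"), (1, "b")])  # duplicate key fails mid-loop
print(ok, con.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # False 0
```

The crucial detail, in SSIS as here, is that the BEGIN/COMMIT/ROLLBACK and the inserts must all run on the same connection; issuing them on a second connection, as in the second attempt above, makes the rollback a no-op.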

SQL Server – import Chinese characters via SSIS

I'm trying to import a file containing Chinese characters into a table. The file encoding is Big5 (Traditional).

Here is a sample file that I need to import: https://www.pastiebin.com/5d7782d9b63fa

The table into which the file is to be imported has the following structure:

create table dbo.Test (
      AccountId   numeric(18, 0) not null
    , Province    nvarchar(50)       null
    , City        nvarchar(50)       collate Chinese_Hong_Kong_Stroke_90_CI_AI
    , Country     nvarchar(50)       null
    , Gender      nvarchar(50)       null
)

If I import with OPENROWSET/BULK then all data will be transferred correctly:

select AccountId, Province, City, Country, Gender
from openrowset (
      bulk 'C:\chinese_sample.dat'
    , firstrow = 1
    , formatfile = 'C:\chinese_sample.xml'
) t

Here's a format file I'm using: https://www.pastiebin.com/5d7783396f9e4

However, when I try to import a file with SSIS, Chinese characters are not parsed correctly.

In the Flat File Source, I use the data type string (DT_STR) with code page 950, and then convert it to a Unicode string (DT_WSTR).


Then I import the converted text into the table.


As you can see, some characters are parsed correctly and others are not.
What am I missing?
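One thing worth ruling out is the file's encoding itself. A quick way to check what code page 950 / Big5 bytes should look like is a round trip in Python (the sample characters are mine; "cp950" is Python's name for Windows code page 950):

```python
# Round-trip two traditional Chinese characters through code page 950.
text = "香港"
raw = text.encode("cp950")
print(raw)                  # the raw bytes SSIS should see in the file
print(raw.decode("cp950"))  # 香港

# Decoding the same bytes with the wrong code page yields mojibake,
# which is what garbled characters in a data flow usually indicate:
print(raw.decode("cp1252", errors="replace"))
```

If a hex dump of the flat file doesn't match the cp950 bytes for the text you expect, the file isn't really Big5, and no code-page setting in the Flat File Source will fix it.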