Ecommerce Warehouse Fulfilment Centre EU (COD Option)

Hi, are there any suggestions for a trusted, good warehouse fulfilment centre in Europe, one that can fulfil to many countries within 48h, with good prices and COD options? I have been searching for the past 2 weeks, but I can't find any trustworthy ones; they all have bad reviews. The only one I found that looks more trustworthy is FHB Group, but their pricing is insane. Thanks!

etl – Is it standard practice to avoid foreign key constraints on a data warehouse dimensional model?

Is it true that we never enable foreign key constraints in a data warehouse dimensional model? If so, what is the reason for this?

According to my research:

Some experts told me that in a dimensional model the FKs are never enabled, and that it is the ETL process's responsibility to ensure consistency and integrity.

Even when the ETL loads everything in the correct dependency order, data integrity problems can still arise:


  • A dimension arrives late from the source system.

  • A few records fail a data quality check and are diverted to an error table.

  • Intermediate tables are not populated because of a batch load error, and the proper restart/recovery steps are not followed; someone reruns the final session that loads the fact table while some of the dimensions are still unpopulated.

  • Likewise, primary key constraints protect me from loading duplicate records when the staging data is processed again because the target table's load session was accidentally triggered twice.
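To make these failure modes concrete, here is a minimal sketch (Python with SQLite; all table and column names are invented for illustration) of how an enabled FK turns a late-arriving dimension into a hard load error instead of a silent orphan fact:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite leaves FK enforcement off by default

conn.execute("CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""
    CREATE TABLE fact_sales (
        sale_id      INTEGER PRIMARY KEY,  -- PK also blocks accidental duplicate re-loads
        customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
        amount       REAL
    )
""")

conn.execute("INSERT INTO dim_customer VALUES (1, 'Alice')")
conn.execute("INSERT INTO fact_sales VALUES (100, 1, 9.99)")  # fine: dimension member exists

# Late-arriving dimension: the fact references a member that has not landed yet.
# With the FK enabled, the load fails loudly instead of creating an orphan row.
try:
    conn.execute("INSERT INTO fact_sales VALUES (101, 2, 5.00)")
except sqlite3.IntegrityError as exc:
    print("load rejected:", exc)
```

The usual counterargument is load performance: checking every fact row against its dimensions costs time on bulk loads, which is why many teams validate in the ETL instead and leave the constraints disabled.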

What problems do you see with enabling FK constraints in the dimension model?

Azure SQL Data Warehouse user provisioning

When you grant a new Azure Active Directory user access to an Azure SQL Data Warehouse database, does the user have to be added to the master database?

The documentation only covers adding the Azure Active Directory user to the specific warehouse database and mapping the user to a role in that database. However, I have found that if the user is not also added to master, they cannot then log on to the database via SSMS.

I do the following:

CREATE USER [] FROM EXTERNAL PROVIDER;
EXEC sp_addrolemember 'db_datareader', '';

USE master;
CREATE USER [] FROM EXTERNAL PROVIDER;

Does the user always have to be added to master? Is there a better way to configure security so that I don't always have to add users to master? Am I totally missing something here?

BLOWOUT SALE – Dedicated servers from USD 29 / month – LIMITED WAREHOUSE!

About QuadraNet – Established in 2001, specializing in both self-managed and fully managed colocation solutions. We have on-site staff 24/7/365 – actual employees (no on-call contractors) who can solve problems on your behalf promptly and competently when required. When you work with us, you work with the facility operator and provider; you're not dealing with a downstream company that resells space, power, and network connectivity. We are a financially stable, debt-free company serving customers around the world – you can trust us with your equipment and your reputation!

While this is our current special, please contact us if you are looking for something other than what we have listed.

Dedicated server BLOWOUT – Valid in Los Angeles

Server specifications:

# AMD EPYC 7302P 16-Core (128 MB cache, 3.0 GHz, AMD Infinity architecture)
# 32 GB DDR4 ECC Registered Memory
# 1x 500 GB Solid State Drive (SSD)
# 30 TB QuadraNet Premium bandwidth
#   1 Gbit/s public port
#   1 Gbit/s private port
# /29 IPv4 assignment – 5 usable IPs
# /64 IPv6 assignment – millions of IPs
# 24/7 KVM over IP on demand
# Noction Intelligent Routing Platform enabled network
# QuadraNet VEST DDoS protection – 3 Gbps detection and mitigation
# $279/MONTH



The QuadraNet network

Based on our years of experience in the industry, and after using and testing essentially every available network backbone, our list of networks was selected based on the strengths each offers individually and in combination with the rest. We know what works, where it works, and how best to deploy it to serve YOU with the best performance available.

QuadraNet Los Angeles backbones

GT-T/TiNet – Transit – 1 x 10 Gbit/s
PCCW/BTN – Transit – 1 x 10 Gbit/s
ChinaUnicom – Transit – 1 x 10 Gbit/s
Cogent Communications – Transit – 1 x 10 Gbit/s
Telia – Transit – 1 x 10 Gbit/s
China Telecom – Transit – 1 x 10 Gbit/s
AboveNet/Zayo – Transit – 1 x 10 Gbit/s
Equinix Exchange – Peering – 1 x 10 Gbit/s
Any2Exchange – Peering – 1 x 10 Gbit/s – Transit – 2 x 10 Gbit/s
HiNet Taiwan – Peering – 1 x 10 Gbit/s
Google Direct – Peering – 2 x 10 Gbit/s
Plus hundreds of additional network peers

Facility includes:

# 100% Power Uptime SLA
# 99.999% Network Uptime SLA
# UPS battery backup power
# Automatic switchover
# Caterpillar Diesel Backup Power Generation

Contact our sales team at or by phone (888-5-QUADRA) to arrange a personal tour of the facilities!

Do you need anything else? We are not limited to the packages mentioned above! Contact us for a special offer tailored just for you!

mysql – How do I create a debt data warehouse?

1. Goal
I'm trying to build a debt data warehouse, and I don't know what the typical roadmap is for such a project. I am extremely lost because I am new to the data warehouse world.

2. What I'm doing

My instructor recommended these three datasets

The goal is to get insight into debt in relation to GDP and the stock market.
Is what I'm trying to do here feasible, or am I going about it wrong?

But honestly, I don't know how I would connect them.
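One common way to connect independent datasets like these is to give each one its own fact table and conform them all on a shared date dimension, then join through that dimension. A minimal sketch (Python with SQLite; the table names and numbers below are invented for illustration, not real figures):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- shared (conformed) date dimension
    CREATE TABLE dim_date   (date_key INTEGER PRIMARY KEY, year INTEGER);
    -- one fact table per source dataset, all keyed on the same dimension
    CREATE TABLE fact_debt  (date_key INTEGER REFERENCES dim_date, debt REAL);
    CREATE TABLE fact_gdp   (date_key INTEGER REFERENCES dim_date, gdp REAL);
    CREATE TABLE fact_stock (date_key INTEGER REFERENCES dim_date, index_close REAL);

    INSERT INTO dim_date   VALUES (20200101, 2020),   (20210101, 2021);
    INSERT INTO fact_debt  VALUES (20200101, 27.7),   (20210101, 29.6);
    INSERT INTO fact_gdp   VALUES (20200101, 21.1),   (20210101, 23.7);
    INSERT INTO fact_stock VALUES (20200101, 3756.0), (20210101, 4766.0);
""")

# Debt-to-GDP ratio alongside the stock index, one row per year.
rows = conn.execute("""
    SELECT d.year, fd.debt / fg.gdp AS debt_to_gdp, fs.index_close
    FROM dim_date d
    JOIN fact_debt  fd ON fd.date_key = d.date_key
    JOIN fact_gdp   fg ON fg.date_key = d.date_key
    JOIN fact_stock fs ON fs.date_key = d.date_key
    ORDER BY d.year
""").fetchall()

for year, ratio, close in rows:
    print(year, round(ratio, 2), close)
```

Because the datasets only have to agree on the date grain, this pattern works even when the sources arrive in completely different formats.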

3. Tools that I will use

Any help regarding the data warehouse roadmap, GDP, and the stock market would be greatly appreciated, as this is a very new area for me.


Should a data warehouse be used as a data source for external reporting?

This question is very similar to this question and this question, but it seemed different enough that I thought it deserved a separate post.

I work for an organization that is currently building a data warehouse. I myself am very new to this concept and have read a few articles to understand the purpose and scope of a data warehouse.

I am currently working on a project that requires data to be sent to an external provider to meet reporting requirements mandated by law. Because the description of this functionality included the word "reporting," someone thought it a good idea to hand this part of the project to the data warehouse team.

That seemed strange to me, but not knowing much about data warehouses, I couldn't say how much to complain (apart from the fact that the data warehouse team has no domain-specific knowledge here). We are making changes to a source system to support these reporting requirements, and as far as I can tell, the data warehouse will not add value to the data from the source system. In other words, it will not clean the data or integrate it with data from another system. All it will do is take what could have been a series of real-time online notifications to an external provider and convert them into a nightly batch report sent to that provider.

Is this just a complete misuse of a data warehouse?

tnsnames – How can I connect to an Oracle Autonomous Data Warehouse using a third-party IDE (DataGrip)?

I am trying to connect to an Oracle Autonomous Data Warehouse database using Jetbrains DataGrip. Oracle provides me with a wallet file (a zip file) that contains tnsnames.ora, a keystore, and some other files.

I'm having a lot of trouble using this information to connect to the database from DataGrip. I found a thread on the DataGrip support forums, but I'm not having any luck with that either.

Jetbrains support thread:
Relevant Oracle documentation:

What I did:
1. Created the environment variable 'TNS_ADMIN' and set it to:
C:\Users\xxx\Documents\(folder with wallet files)
2. Added the Oracle JDBC driver files (ojdbc8.jar, osdt_cert.jar, oraclepki.jar, osdt_core.jar) to the standard Oracle driver in DataGrip
3. Edited 'sqlnet.ora' so that it contains the path to the wallet files
4. Added the following to the data source VM options: -Doracle.net.tns_admin=C:\Users\xxx\Documents\(folder with wallet files)
5. Set the connection type to "URL only"
6. Tried different connection strings in the URL field:

jdbc:oracle:thin:@//… (several host/service variations, all pointing at C:\Users\xxx\Documents\(folder with wallet files))
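For reference, Oracle's documentation for the 18.3+ thin JDBC driver describes pointing the driver at the wallet either through a JVM system property or through a TNS_ADMIN parameter in the URL itself. Roughly (the alias adw1_high is a placeholder that would come from tnsnames.ora):

```
# as a data source VM option:
-Doracle.net.tns_admin=C:\Users\xxx\Documents\(folder with wallet files)

# or embedded in the JDBC URL:
jdbc:oracle:thin:@adw1_high?TNS_ADMIN=C:\Users\xxx\Documents\(folder with wallet files)
```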



Connection to ADW1 failed.  
(08006)(17002) IO Error: Got minus one from a read call, connect lapse 32 ms.,  
Authentication lapse 0 ms.  

I also tried the connection types "Service name" and "TNS" and entered the information from "tnsnames.ora". No dice, same error.

I also tried explicitly disabling 'tcp.validnode_checking'.

(The connection works fine in SQL Developer.)

What's the right way to do this?

Data warehouse – store emails in a separate dimension or in a degenerate dimension?

I have just started learning dimensional modeling and am creating a star schema to analyze email newsletter signups for an online company.

I have a fact table that records signups and links to a contact dimension, a traffic source dimension, a page dimension, and a junk dimension with various flags.

I also have a degenerate dimension holding the signup ID from the source system.
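For context, here is a minimal sketch of the star described above (Python with SQLite; every table and column name is my guess, not the actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_contact        (contact_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_traffic_source (source_key  INTEGER PRIMARY KEY, source TEXT);
    CREATE TABLE dim_page           (page_key    INTEGER PRIMARY KEY, url TEXT);
    CREATE TABLE dim_signup_junk    (junk_key    INTEGER PRIMARY KEY, is_confirmed INTEGER);

    CREATE TABLE fact_signup (
        signup_id   TEXT PRIMARY KEY,  -- degenerate dimension: ID from the source system
        contact_key INTEGER REFERENCES dim_contact(contact_key),
        source_key  INTEGER REFERENCES dim_traffic_source(source_key),
        page_key    INTEGER REFERENCES dim_page(page_key),
        junk_key    INTEGER REFERENCES dim_signup_junk(junk_key)
        -- open question: where should the email address live?
    );
""")

tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]
print(tables)
```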

But one of the things I can't figure out is the email address.

I don't need it for filtering, but it's good to have when I drill into the data and look at individual signups.

The data has high cardinality: approximately 70% unique emails and 30% duplicates.

Option 1: junk dimension

I considered moving it into the junk dimension, but that would swell that dimension from ~1,000 rows to 133,000 at present.

This is clearly a bad choice.

Option 2: email dimension

Having a dimension with only one column doesn't seem right, and it would hardly save any space.

Option 3: Degenerate dimension
Another possibility is to define it as a degenerate dimension, as is done with natural keys. As I understand it, though, degenerate dimensions should only be used for values that are unique for each fact row.

In addition, it is said that free-text values in the fact table should be avoided. Would 70% uniqueness qualify?

Finally, the best option is probably to record the email via the contact dimension and define it as SCD Type 2.

But my existing contact dimension is SCD Type 1, and I don't currently have the know-how to properly implement and maintain a Type 2. In addition, the data is loaded once a day, so it would not capture a person who signs up with one email and changes it to another on the same day, before the ETL retrieves the data.

To formulate my question more clearly, I would like guidance on the following:

  • Does it make sense to have a dimension table with only one column and almost as many rows as the fact table?
  • Is it okay to use a degenerate dimension for values that are not 1:1 with the fact table rows (but still have high cardinality)?
  • How bad is it to add a free-text column to a fact table?