sql server – Test to see when a table is a candidate for compression (row or page)

Stephen’s answer pretty much sums up the information you’re looking for; I’m just adding my two cents, hopefully without being redundant.

Backups shouldn’t take any longer because of Page or Row Compression. Worst case they take about the same time, and they can possibly be faster, because compressing the data is part of the normal backup process. I would presume (I haven’t tested yet) that already-compressed data measurably speeds up that step of the backup, since the work has already been done ahead of time.

With compression, you’re essentially trading more CPU consumption for improved disk and memory performance. Data is persisted on disk and held in memory in compressed form. As Stephen pointed out, it isn’t decompressed until it’s delivered to the consuming application, which is when you may see higher CPU usage. And, as you guessed, when new data is written to the database, it is compressed first as well.

You should monitor your CPU throughout the day, especially during the common and heavy tasks your environment’s workload typically endures. If you find your CPU is constantly pegged near 100% (and it previously wasn’t), then you may want to investigate whether your compression settings are the main contributor, and possibly test loosening the settings on some of your most frequently used or largest tables / objects.

I always refer to sp_estimate_data_compression_savings (as Stephen mentioned) before changing compression, as it’s pretty helpful in telling you how beneficial it may be to compress an object one way or another vs. not compressing it at all. You can use it to check whether you’re unnecessarily compressing some tables for no benefit in return.
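If it’s useful, here’s a minimal sketch of how I pull those estimates programmatically. It assumes a pyodbc connection and a hypothetical dbo.MyTable; adjust the connection string and object names to your environment:

import pyodbc

# Assumed connection string; replace with your own server, database and auth details.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=MyDb;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Estimate savings for a hypothetical dbo.MyTable under both ROW and PAGE compression.
for compression in ("ROW", "PAGE"):
    cursor.execute(
        "EXEC sp_estimate_data_compression_savings "
        "@schema_name = ?, @object_name = ?, "
        "@index_id = NULL, @partition_number = NULL, "
        "@data_compression = ?",
        ("dbo", "MyTable", compression),
    )
    for row in cursor.fetchall():
        # Each row reports, per index/partition, the current size and the
        # estimated size with the requested compression (both in KB).
        print(compression, list(row))

Comparing the estimated size against the current size per object makes it fairly obvious which tables are worth compressing and which ones give you nothing back.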

That said, I’ve generally found modern CPUs handle this well, even under fast-paced workloads (in my experience working with semi-big data), and I’ve seen plenty of opportunities to put otherwise idle CPU to use. On that note, I like one of Brent Ozar’s mantras: unused server resources are (to a degree) a waste, since you’re paying for something you’re not using.

algorithms – Pruning: Outlier Candidate Selection

I am using method 3.3 described in Outlier Detection using Isolation Forest and Local Outlier Factor.

It states:

Specify a dataset: $D = \{d_1, d_2, \ldots, d_n\}$. Here, $n$ is the sample number of $D$, $d_i$ is an attribute in $D$, and $d_i = \{x_1, x_2, \ldots, x_n\}$; $x_j$ is a certain data value of the attribute $d_i$. The outlier coefficient of the attribute is defined as:

[formula image: definition of the outlier coefficient $f_{d_i}$]

Here, $\bar{x}$ is the mean of the attribute $d_i$, and $f_{d_i}$ is used to measure the degree of dispersion of the attribute $d_i$. Calculate the outlier coefficient of each attribute in the dataset, and get the outlier coefficient vector $D_f$ of the dataset, which is recorded as:
$$D_f = (f_{d_1}, f_{d_2}, \ldots, f_{d_n}).$$

However, I do not understand what value to select for $x_j$. What does a ‘certain data value’ mean in this case?
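For reference, my current reading is that $x_j$ simply ranges over all $n$ values of the attribute $d_i$, so there is nothing to “select”; the coefficient is computed from the whole column. A minimal sketch of that reading in Python (the actual formula for $f_{d_i}$ is in the image above; the mean-absolute-deviation measure below is only an assumed stand-in):

import numpy as np

def outlier_coefficients(data: np.ndarray) -> np.ndarray:
    """One dispersion coefficient per attribute (column) of the dataset.

    `data` has shape (n_samples, n_attributes); each column is an attribute d_i,
    and x_j ranges over every value in that column. The formula used here
    (mean absolute deviation from the column mean) is only a stand-in for the
    paper's f_{d_i}, whose exact definition is in the image above.
    """
    means = data.mean(axis=0)                 # the mean \bar{x} of each attribute
    return np.abs(data - means).mean(axis=0)  # assumed f_{d_i} per attribute

# Toy dataset D: 5 samples, 3 attributes; the first attribute has an obvious outlier.
D = np.array([[1.0, 10.0, 5.0],
              [1.2, 11.0, 5.1],
              [0.9,  9.5, 4.9],
              [1.1, 10.5, 5.0],
              [9.0, 10.2, 5.2]])

D_f = outlier_coefficients(D)                 # outlier coefficient vector of the dataset
print(D_f)

Is that the intended interpretation, or does “a certain data value” mean something else?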

What fails in constructing a homotopy category out of candidate triangles in a triangulated category?

Following Neeman’s article “New axioms for triangulated categories”, for a triangulated category $\mathscr T$ let $CT(\mathscr T)$ denote the category of candidate triangles, i.e. diagrams
\begin{equation}X\overset{f}{\to} Y\overset{g}{\to} Z\overset{h}{\to}\Sigma X\quad (*)\end{equation}
such that $gf=0$, $hg=0$ and $(\Sigma f)h=0$, with morphisms being commutative diagrams between such triangles.
We can define homotopy of maps between candidate triangles to be chain homotopy, and there is an automorphism $\tilde\Sigma\colon CT(\mathscr T)\to CT(\mathscr T)$ which takes $(*)$ to
$$Y\overset{-g}{\to} Z\overset{-h}{\to} \Sigma X\overset{-\Sigma f}{\to} \Sigma Y.$$
We can define mapping cones as in a usual chain complex category, and a lot of the usual results hold for this category (e.g. homotopic maps have isomorphic mapping cones).
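For concreteness, by mapping cone I mean the chain-complex-style construction (this is how I read Neeman; the letters $\alpha,\beta,\gamma$ and the primed structure maps are my notation): given a morphism $(\alpha,\beta,\gamma)$ from the candidate triangle $(*)$ to $X'\overset{f'}{\to} Y'\overset{g'}{\to} Z'\overset{h'}{\to}\Sigma X'$, its cone is the candidate triangle
$$Y\oplus X'\xrightarrow{\begin{pmatrix}-g&0\\ \beta&f'\end{pmatrix}} Z\oplus Y'\xrightarrow{\begin{pmatrix}-h&0\\ \gamma&g'\end{pmatrix}} \Sigma X\oplus Z'\xrightarrow{\begin{pmatrix}-\Sigma f&0\\ \Sigma\alpha&h'\end{pmatrix}} \Sigma Y\oplus\Sigma X'.$$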

What I fail to see is why the mapping cone construction, along with $\tilde\Sigma$, does not give rise to a triangulation of $CT(\mathscr T)$?

image processing – Simple shape matching, refine ‘best’ candidate based on contour deviation? Or alternative (Python/C++ and OpenCV etc.)

Apologies if this question is not composed correctly; I can revise if necessary.

I am essentially doing template matching from a large array of candidate images. I am actually interested in the ‘best’ available match – best being subjective, but ultimately what a human would likely perceive as closest (smooth outline, retaining shape, etc).

I have done a lot of work on identifying close matches, but Hu and Zernike scoring aren’t getting me that final step.

The image I have attached (I just drew these to illustrate the point; they are not actual data) shows a template and two matches, along with their deviation from the template’s contour.

The red shape shows (theoretical) deviations for a lower-quality match and the green one a better-quality match. Essentially I am defining the lower-quality match as the one with more extreme swings in the deviation from my template’s contour, which I am trying to illustrate on my (poorly drawn) graph. I’m not sure how to extract this data or analyze it effectively; I’m just considering it.
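For what it’s worth, this is roughly the kind of measurement I have in mind (a rough sketch only; template_contour and candidate_contour are assumed to come from my existing cv2.findContours step, already scaled and aligned):

import numpy as np
import cv2

def contour_deviation_stats(template_contour, candidate_contour):
    """Measure how a candidate's contour deviates from the template's.

    For each point on the candidate contour, compute its signed distance to the
    template contour (positive inside, negative outside), then summarise the
    deviation signal. A large standard deviation or large point-to-point jumps
    correspond to the "extreme swings" I want to penalise.
    """
    deviations = np.array([
        cv2.pointPolygonTest(template_contour, (float(x), float(y)), True)
        for [[x, y]] in candidate_contour
    ])
    swings = np.abs(np.diff(deviations))      # point-to-point jumps along the outline
    return {
        "mean_abs_dev": float(np.abs(deviations).mean()),
        "std_dev": float(deviations.std()),
        "max_swing": float(swings.max()),
        "mean_swing": float(swings.mean()),
    }

# Example use with contours from cv2.findContours:
# stats = contour_deviation_stats(template_contour, candidate_contour)
# Candidates could then be re-ranked by e.g. stats["std_dev"] or stats["mean_swing"].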

I am really looking for advice or concepts on how I can refine my ‘top 500’ Hu-scored matches to get the best-defined shape match (or at least improve my ranking). My actual data is of similar complexity (i.e. not photos), and I am certainly seeing some visually better matches with a worse Hu score in my current ranking. I’m primarily using Python/C++ and OpenCV.

Any advice?

candidate key – Solution verification for functional dependencies using Armstrong’s axioms

I have the following functional dependencies, which include all attributes of the relation:

[Image: the functional dependencies of the relation]

CF is supposedly a candidate key. Is the following reasoning correct?

ACF => AB (Augmentation)

AB => C

BC => BAD (Augmentation)

BAD => BED (Augmentation)

BAD => D (Decomposition)

D => E
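To cross-check the derivation mechanically, I was also thinking of computing the attribute closure. A small sketch (the FD list below is only a partial placeholder built from the steps above; replace it with the full set of dependencies):

def closure(attributes, fds):
    """Closure of `attributes` under the functional dependencies `fds`.

    `fds` is a list of (lhs, rhs) pairs of attribute sets. CF is a candidate key
    iff the closure of {'C', 'F'} contains every attribute of the relation and
    no proper subset of {'C', 'F'} has that property.
    """
    result = set(attributes)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

# Partial placeholder FDs taken from the steps above; fill in the remaining dependencies.
fds = [
    ({"A", "B"}, {"C"}),
    ({"D"}, {"E"}),
    # ...
]
print(closure({"C", "F"}, fds))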

On KDE Neon 5.20 I am unable to install any packages due to the error “Package ‘…’ has no installation candidate” or “Unable to locate package”

I hope someone can help me solve my issue: on KDE Neon 5.20 I am unable to install almost any package, due to the error “Package ‘…’ has no installation candidate” or “Unable to locate package”.

For example, I want to install some dependency packages for touchpad gestures:

sudo apt-get install wmctrl python3 python3-setuptools xdotool python3-gi libinput-tools python-gobject

then the response is:

asus-dgn@ASUSVivoBook-X409JA:~$ sudo apt-get install wmctrl python3 python3-setuptools xdotool python3-gi libinput-tools python-gobject
Reading package lists... Done
Building dependency tree       
Reading state information... Done
Package python3-setuptools is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source

Package wmctrl is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source

Package xdotool is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source

E: Package 'wmctrl' has no installation candidate
E: Package 'python3-setuptools' has no installation candidate
E: Package 'xdotool' has no installation candidate
E: Unable to locate package python-gobject

Also, when I try to install neofetch or other apps/packages with the command:

sudo apt-get install neofetch

it returns:

asus-dgn@ASUSVivoBook-X409JA:~$ sudo apt-get install neofetch
Reading package lists... Done
Building dependency tree       
Reading state information... Done
E: Unable to locate package neofetch

This occurs with almost all packages, so I hope someone can help me solve this problem.

Thank you.