algebraic topology – How to prove/disprove that a map is a covering space?

I'm currently learning some algebraic topology, and I have trouble showing that given maps are covering spaces. I understand what a covering space is definition-wise, but not how best to proceed in showing that something actually is a cover. Could someone explain this to me, ideally with an example?

postgresql – Postgres Query Slow With Covering Indexes

I have the following tables:

Spring ACL:

create table acl_sid(
    id bigserial not null primary key,
    principal boolean not null,
    internal  boolean not null,
    sid varchar(100) not null,
    constraint unique_uk_1 unique(sid,principal)
);

and Spring Security:

CREATE TABLE IF NOT EXISTS users
(
    id                     NUMERIC(20, 0)                           NOT NULL PRIMARY KEY,
    email                  VARCHAR(255)                             NOT NULL,
    password               VARCHAR(60)                              NOT NULL,
    first_name             VARCHAR(255)   DEFAULT NULL              NULL,
    last_name              VARCHAR(255)   DEFAULT NULL              NULL,
    ...  
);

and

CREATE TABLE IF NOT EXISTS authorities
(
    user_id NUMERIC(20, 0) NOT NULL,
    sid_id  BIGINT         NOT NULL,
    CONSTRAINT fk_authorities_users FOREIGN KEY (user_id) REFERENCES users (id) ON UPDATE CASCADE ON DELETE CASCADE,
    CONSTRAINT fk_authorities_sids FOREIGN KEY (sid_id) REFERENCES acl_sid (id) ON UPDATE CASCADE ON DELETE CASCADE
);

CREATE UNIQUE INDEX ix_auth_username ON authorities (user_id, sid_id);

and an application-related table:

CREATE TABLE IF NOT EXISTS entities
(
    id         NUMERIC(20, 0)                           NOT NULL PRIMARY KEY,
    some_field VARCHAR(15)                              NOT NULL,
    ...
    created_at TIMESTAMP      DEFAULT CURRENT_TIMESTAMP NOT NULL,
    created_by NUMERIC(20, 0) DEFAULT NULL              NULL,
    updated_at TIMESTAMP      DEFAULT NULL              NULL,
    updated_by NUMERIC(20, 0) DEFAULT NULL              NULL,
    CONSTRAINT fk_entities_created_by FOREIGN KEY (created_by) REFERENCES users (id) ON UPDATE CASCADE ON DELETE SET NULL,
    CONSTRAINT fk_entities_updated_by FOREIGN KEY (updated_by) REFERENCES users (id) ON UPDATE CASCADE ON DELETE SET NULL
);

CREATE INDEX entities_created_by_idx ON entities (created_by);
CREATE INDEX entities_updated_by_idx ON entities (updated_by);

The following query was taking too long (~10 seconds):

SELECT e.*,
       t1.*,
       t2.*,

       u.id          AS cre_by_id,
       u.email       AS cre_by_email,
       u.password    AS cre_by_password,
       u.first_name  AS cre_by_first_name,
       u.last_name   AS cre_by_last_name,
       u.created_at  AS cre_by_created_at,
       u.updated_at  AS cre_by_updated_at,

       u2.id         AS upd_by_id,
       u2.email      AS upd_by_email,
       u2.password   AS upd_by_password,
       u2.first_name AS upd_by_first_name,
       u2.last_name  AS upd_by_last_name,
       u2.created_at AS upd_by_created_at,
       u2.updated_at AS upd_by_updated_at,

       u3.id         AS t1_cre_by_id,
       u3.email      AS t1_cre_by_email,
       u3.password   AS t1_cre_by_password,
       u3.first_name AS t1_cre_by_first_name,
       u3.last_name  AS t1_cre_by_last_name,
       u3.created_at AS t1_cre_by_created_at,
       u3.updated_at AS t1_cre_by_updated_at,

       u4.id         AS t1_upd_by_id,
       u4.email      AS t1_upd_by_email,
       u4.password   AS t1_upd_by_password,
       u4.first_name AS t1_upd_by_first_name,
       u4.last_name  AS t1_upd_by_last_name,
       u4.created_at AS t1_upd_by_created_at,
       u4.updated_at AS t1_upd_by_updated_at,

       u5.id         AS t2_cre_by_id,
       u5.email      AS t2_cre_by_email,
       u5.password   AS t2_cre_by_password,
       u5.first_name AS t2_cre_by_first_name,
       u5.last_name  AS t2_cre_by_last_name,
       u5.created_at AS t2_cre_by_created_at,
       u5.updated_at AS t2_cre_by_updated_at,

       u6.id         AS t2_upd_by_id,
       u6.email      AS t2_upd_by_email,
       u6.password   AS t2_upd_by_password,
       u6.first_name AS t2_upd_by_first_name,
       u6.last_name  AS t2_upd_by_last_name,
       u6.created_at AS t2_upd_by_created_at,
       u6.updated_at AS t2_upd_by_updated_at,

       s.sid         AS cre_by_authority,
       s2.sid        AS upd_by_authority,
       s3.sid        AS t1_cre_by_authority,
       s4.sid        AS t1_upd_by_authority,
       s5.sid        AS t2_cre_by_authority,
       s6.sid        AS t2_upd_by_authority

FROM entities e
         
         LEFT JOIN t1 ON ...
         LEFT JOIN t2 ON ...
         LEFT JOIN users u ON e.created_by = u.id AND u.deleted_at IS NULL
         LEFT JOIN users u2 ON e.updated_by = u2.id AND u2.deleted_at IS NULL
         LEFT JOIN users u3 ON t1.created_by = u3.id AND u3.deleted_at IS NULL
         LEFT JOIN users u4 ON t1.updated_by = u4.id AND u4.deleted_at IS NULL
         LEFT JOIN users u5 ON t2.created_by = u5.id AND u5.deleted_at IS NULL
         LEFT JOIN users u6 ON t2.updated_by = u6.id AND u6.deleted_at IS NULL
         LEFT JOIN authorities a ON u.id = a.user_id
         LEFT JOIN authorities a2 ON u2.id = a2.user_id
         LEFT JOIN authorities a3 ON u3.id = a3.user_id
         LEFT JOIN authorities a4 ON u4.id = a4.user_id
         LEFT JOIN authorities a5 ON u5.id = a5.user_id
         LEFT JOIN authorities a6 ON u6.id = a6.user_id
         LEFT JOIN acl_sid s ON a.sid_id = s.id
         LEFT JOIN acl_sid s2 ON a2.sid_id = s2.id
         LEFT JOIN acl_sid s3 ON a3.sid_id = s3.id
         LEFT JOIN acl_sid s4 ON a4.sid_id = s4.id
         LEFT JOIN acl_sid s5 ON a5.sid_id = s5.id
         LEFT JOIN acl_sid s6 ON a6.sid_id = s6.id
         JOIN (SELECT id
               FROM (SELECT DISTINCT t.id, ROW_NUMBER() OVER (ORDER BY t.some_date DESC) AS rn
                     FROM entities t WHERE some_col = :someCol) entities_table
               WHERE rn <= :limit * :page + :limit
                 AND rn > :limit * :page) pagination_table ON e.id = pagination_table.id
ORDER BY e.some_date DESC;

I suspected the JOINs might be making the query slow, so I removed:

s.sid         AS cre_by_authority,
s2.sid        AS upd_by_authority,
s3.sid        AS t1_cre_by_authority,
s4.sid        AS t1_upd_by_authority,
s5.sid        AS t2_cre_by_authority,
s6.sid        AS t2_upd_by_authority

and the query is fast again (~100ms). So there might be something wrong with the indexes on acl_sid. I know that an index defined like this:

CREATE INDEX ix_auth_username ON authorities (user_id, sid_id);

also covers lookups on the user_id column alone (the index I would otherwise create for the user_id FK). I don't have exactly that index, but something very close: CREATE UNIQUE INDEX...

But if I create the index on user_id anyway:

CREATE INDEX user_id_idx ON authorities (user_id);

the query is fast again. Shouldn't this lookup already be covered by:

CREATE UNIQUE INDEX ix_auth_username ON authorities (user_id, sid_id);

What am I missing?
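For reference, one way to see what the planner is actually doing is to run the joined part under EXPLAIN and check whether ix_auth_username appears in the plan. This is only a diagnostic sketch against the tables defined above; exact plan output will differ per installation:

```sql
-- Sketch of a diagnostic: does the planner use ix_auth_username for the
-- user_id join, or fall back to a sequential scan of authorities?
EXPLAIN (ANALYZE, BUFFERS)
SELECT u.id, s.sid
FROM users u
LEFT JOIN authorities a ON u.id = a.user_id
LEFT JOIN acl_sid s ON a.sid_id = s.id;

-- If the composite index is being ignored, stale statistics are one
-- possible cause; refreshing them may change the planner's choice:
ANALYZE authorities;
```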

set theory – Bounds for a covering number of the circle group $\mathbb T$ by some of its small subgroups

$\newcommand{\w}{\omega}\newcommand{\A}{\mathcal A}\newcommand{\F}{\mathcal F}\newcommand{\I}{\mathcal I}\newcommand{\J}{\mathcal J}\newcommand{\M}{\mathcal M}\newcommand{\N}{\mathcal N}\newcommand{\x}{\mathfrak x}\newcommand{\cov}{\mathrm{cov}}\newcommand{\lac}{\mathrm{lac}}\newcommand{\non}{\mathrm{non}}\newcommand{\IT}{\mathbb T}$
Taras Banakh and I are on a long quest to answer a question of ougao at Mathematics.SE.

Recall that the circle $\IT=\{z\in\mathbb C:|z|=1\}$, endowed with the multiplication of complex numbers and the topology inherited from $\mathbb C$, is a topological group. We consider the cardinal $\cov(\A(\IT))$, which is the smallest size of a family $\mathcal U$ of strictly increasing sequences $(u_n)_{n\in\w}$ of natural numbers such that for each $z\in\IT$ there exists $(u_n)_{n\in\w}\in\mathcal U$ such that the sequence $(z^{u_n})_{n\in\w}$ converges to $1$.
Ideally, we would like to find a known small cardinal equal to $\cov(\A(\IT))$. While $\cov(\A(\IT))$ remains unknown, we are interested in bounds for it in terms of known small cardinals.

Our try.

Upper bounds.

Let $(\w)^\w$ denote the family of all infinite subsets of $\w$. A subfamily $\mathcal R\subseteq(\w)^\w$ is called reaping if for any set $X\in(\w)^\w$ there is $R\in\mathcal R$ such that one of the sets $R\cap X$ and $R\setminus X$ is finite. The reaping number $\mathfrak r$ is the cardinality of the smallest reaping family. By Proposition 9.9 from (1), $\mathfrak r$ is the minimum cardinality of an ultrafilter pseudobase. Recall that a pseudobase for a filter $\F$ on $\w$ is a family $\mathcal P$ of infinite subsets of $\w$ such that every set in $\F$ has a subset in $\mathcal P$.

A family $\mathcal R$ of infinite subsets of $\w$ is called $\sigma$-reaping if for any countable family $\mathcal X$ of infinite subsets of $\w$ there is $R\in\mathcal R$ such that for any $X\in\mathcal X$ one of the sets $R\cap X$ and $R\setminus X$ is finite. The $\sigma$-reaping number $\mathfrak r_\sigma$ is the cardinality of the smallest $\sigma$-reaping family. Clearly $\mathfrak r\le\mathfrak r_\sigma$, and it is an old open problem whether $\mathfrak r<\mathfrak r_\sigma$ is consistent; see (4), (3), and (1, 3.6). By (3), $\mathfrak r_\sigma\le\mathfrak u_p$, where $\mathfrak u_p$ is the smallest cardinality of a base of a $P$-point if a $P$-point exists, and $\mathfrak u_p=\mathfrak c$ if no $P$-point exists. It is known that $\mathfrak u_p=\mathfrak u$ if $\mathfrak u<\mathfrak d$. Let us recall that $\mathfrak u$ is the smallest cardinality of a base of a free ultrafilter on $\w$.

By Theorem 3.7 from (1), $\mathfrak r_\sigma$ is equal to the smallest cardinality of a family $\mathcal R\subseteq(\w)^\w$ such that for any bounded sequence $(x_n)_{n\in\w}$ of real numbers there exists $R\in\mathcal R$ such that the subsequence $(x_n)_{n\in R}$ converges in the real line. It easily follows that $\cov(\A(\IT))\le\mathfrak r_\sigma$.

Problem. Is $\cov(\A(\IT))\le\mathfrak r$?

Lower bounds.

For any family $\I$ of sets with $\bigcup\I\notin\I$, let $\cov(\I)=\min\{|\J|:\J\subseteq\I\;\wedge\;\bigcup\J=\bigcup\I\}$ and $\non(\I)=\min\{|A|:A\subseteq\bigcup\I\;\wedge\;A\notin\I\}$. Let $\M$ and $\N$ be the ideals of meager and Lebesgue null subsets of the real line, respectively.

It is easy to show that $\cov(\A(\IT))\ge\max\{\cov(\M),\cov(\N),\x\}$, where $\x$ is an auxiliary cardinal introduced as follows. An infinite set $R\subseteq\w$ of natural numbers is called remote if there exists $z\in\IT$ such that $\inf_{n\in R}|z^n-1|>0$. Let $\x$ be the smallest cardinality of a family $\F$ of infinite subsets of $\w$ such that for any remote set $R$ there exists $F\in\F$ such that $F\cap R$ is finite. So it would be good for us to find a known small cardinal equal to $\x$. While $\x$ remains unknown, we are interested in bounds (especially lower bounds) for it in terms of known small cardinals.

Our try for $\x$.

We can prove that $\cov(\M)\le\x$, and we are interested in whether this bound can be improved and whether $\cov(\N)\le\x$.

Lyubomyr Zdomskyy suggested that it is consistent that $\mathfrak d<\x$, where $\mathfrak d$ is the cofinality of $\w^\w$ endowed with the natural partial order: $(x_n)_{n\in\w}\le(y_n)_{n\in\w}$ iff $x_n\le y_n$ for all $n$.

We introduced an auxiliary cardinal $\x_{\lac}$, which is the smallest cardinality of a family $\F$ of infinite subsets of $\w$ such that for any lacunary set $L$ there exists $F\in\F$ such that $F\cap L$ is finite. Recall that an infinite set $L$ of natural numbers is called lacunary if $\inf\{b/a:a,b\in L,\;a<b\}>1$. We have $\x_\lac\le\x$, because Pollington proved in (2) that every lacunary set is remote, as John Griesmer informed us. But it turned out that $\x_\lac$ is rather small. Namely, Will Brian showed that $\x_\lac\le\non(\N)$, and the strict inequality here is consistent.

References

(1) A. Blass, Combinatorial Cardinal Characteristics of the Continuum, in: M. Foreman, A. Kanamori (eds.), Handbook of Set Theory, Springer Science+Business Media B.V. 2010, 395–489.

(2) Andrew D. Pollington, On the density of sequences $\{n_k\xi\}$, Ill. J. Math. 23 (1979) 511–515, ZBL0401.10059.

(3) J. Vaughan, Small uncountable cardinals and topology, Open problems in topology (J. van Mill and G. Reed, eds.), North-Holland, Amsterdam, 1990, 195–218.

(4) P. Vojtáš, Cardinalities of noncentered systems of subsets of $\omega$, Discrete Mathematics 108 (1992) 125–129.

Thanks.

graph theory – Covering edge in a DAG

The book from which I am studying graph theory has this definition of a covering edge:

If $a$ and $b$ are distinct nodes of a digraph, then $a$ is said to cover $b$ if there is an
edge from $a$ to $b$ and every path from $a$ to $b$ includes this edge. If $a$ covers $b$, the
edge from $a$ to $b$ is called a covering edge.

From what I understand from this definition, $a$ is said to cover $b$ if:

  • There is an edge from $a$ to $b$
  • There is only $1$ path from $a$ to $b$. (Since any other path from $a$ to $b$ would not traverse this edge.)

If I am right, then the covering edges of this DAG would be {($1$,$2$),($1$,$3$),($1$,$5$),($2$,$4$),($2$,$6$),($3$,$6$)}.
[image: the DAG in question]

Am I correct? If not, then what should be the covering edges in this digraph?
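If it helps, the definition can be checked mechanically: an edge $(a,b)$ is covering iff $b$ becomes unreachable from $a$ once that single edge is removed. A small sketch (the function and the example DAG below are mine, since the figure isn't reproduced here):

```python
def covering_edges(nodes, edges):
    """Return the covering edges of a digraph: edges (a, b) such that
    every path from a to b uses the edge (a, b) itself, i.e. b becomes
    unreachable from a when that single edge is removed."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)

    def reachable(src, dst, removed):
        # Depth-first search that skips the removed edge.
        stack, seen = [src], set()
        while stack:
            n = stack.pop()
            if n == dst:
                return True
            if n in seen:
                continue
            seen.add(n)
            stack.extend(m for m in adj[n] if (n, m) != removed)
        return False

    return {(a, b) for a, b in edges if not reachable(a, b, removed=(a, b))}


# Hypothetical example (not the DAG from the book): the edge (1, 4) is not
# covering because the path 1 -> 2 -> 4 avoids it.
print(covering_edges([1, 2, 3, 4], [(1, 2), (1, 3), (1, 4), (2, 4), (3, 4)]))
```

Note that in a DAG this is equivalent to your interpretation: any path from $a$ to $b$ other than the edge itself cannot pass through the edge $(a,b)$, since that would force a cycle.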

usa – Any travel insurances covering canceled flights/trains before the day of departure?

Suppose we’re dealing with the following scenarios (that actually happened to me):

  • Being in the middle of a trip in the US, Amtrak canceled the train that I was supposed to take 10 days before the scheduled departure date, due to wildfires. So I had to book an overpriced flight just 10 days before the departure date and spend an extra night (that I was planning to spend on the train) in a hotel.

  • Being in the middle of a trip in the US, an airline canceled the flight one day before departure without rebooking me, and the next flight was only in 3 days. So I had to spend 3 more nights in a hotel.

Are there any travel insurance policies that would have covered those additional expenses (i.e., the extra night(s) in a hotel/hostel, and the flight in the first case)? I only had CSR travel insurance when this happened, but I believe it doesn't cover any of the above.

computational geometry – Approximation algorithm for minimal Covering of an orthogonal polyhedron

Covering an orthogonal polygon with rectangles is NP-complete according to Culberson and Reckhow, even for the case without holes. Franzblau gives a 2-approximation algorithm for simple polygons for this NP-complete problem, which was later also shown to be an 8/3-approximation algorithm for the general case.

I am currently looking for such an approximation algorithm for the 3-dimensional case, for the minimal covering of orthogonal polyhedra with cuboids (both for ones with and without holes). Is there a suitable solution?

Otherwise, my fallback would be to extend Franzblau's algorithm to this case.

Thanks a lot for the help!

Travel Insurance covering Covid-related entry bans for US residents

Do any of the policies cover trip cancellation if a country bans citizens from your particular home country from coming into their country due to Covid?

Typically not.

Insurance policies vary a lot, so you need to carefully study the details of your specific policy to find out. Personally, I found websites like squaremouth.com very helpful, since you can read the full policy before you buy (no advertisement or endorsement intended).

It appears that most insurers would require a "Cancel for any reason" policy to cover that specific case. However, these are rather expensive and typically don't cover all of the travel cost (often only 75%).

Personally, I only buy medical travel insurance, since it's inexpensive and the potential costs could be massive. Trip-cost coverage often costs 5%–10% of the total trip price, and my track record of completing trips is much better than that.

Even if you get banned from entering Austria on short notice, you may still be able to negotiate something with the airlines. It’s also possible that the airline will have to cancel their flight, in which case you would get a full refund.


reference request – What is the covering density of a very thin annulus? Is it $\frac{\pi\sqrt{51\sqrt{17}-107}}{16}$?

Take some very small $\epsilon>0$, and consider the annulus/ring given by the set $\{(r,\theta) \mid 1-\epsilon\le r\le 1\}\subset\mathbb{R}^2$.

We wish to place translated copies of this annulus down so that they cover the plane; obviously, this will cover some points multiple times, since the rings do not tile the plane without overlap. How can we do this to minimize the overall density, i.e., the number of times an average point is covered?

I can obtain a density of $\pi$ with the following construction (overlapping rings shown in darker shades):

[image: overlapping rings achieving density $\pi$]

However, it turns out this is suboptimal; we can do better by only placing $2/epsilon$ of these rings in a line, and covering the plane with the resulting shapes:

[image: $2/\epsilon$ rings placed along a line, tiling the plane]

This uses $2/\epsilon$ rings of area $2\pi\epsilon$ each, for a total area of $4\pi$ per $2\times 3$ rectangle in the tiling, so its density is $\frac{2\pi}{3}\approx 2.094$.

We can improve this further by overlapping the above shapes vertically (as before, each of these is formed from $2/\epsilon$ rings):

[image: the above shapes overlapped vertically]

A bit of calculus tells us this construction is optimized when the vertical overlap between two of the red regions is $2-\sqrt{\frac{3\sqrt{17}-5}{2}}$, for a total density of

$$\frac{\pi\sqrt{51\sqrt{17}-107}}{16}\approx 1.99542$$

Is this optimal? I’m curious about both improvements to this construction, and lower bounds that can be imposed on the density; so far I have not been able to establish lower bounds greater than $1$. Pointers to literature on this or related questions would also be welcome.

(It's fairly easy to show that a random point on a given annulus will be covered an average of at least $1+1/\pi$ times, but this oversamples multiply-covered points, so it doesn't tell us anything directly about the covering density.)
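As a quick numerical sanity check of the closed form above (just evaluating the expressions, nothing more):

```python
import math

# Evaluate the claimed optimal covering density
#   pi * sqrt(51*sqrt(17) - 107) / 16
# and the optimal vertical overlap 2 - sqrt((3*sqrt(17) - 5) / 2).
density = math.pi * math.sqrt(51 * math.sqrt(17) - 107) / 16
overlap = 2 - math.sqrt((3 * math.sqrt(17) - 5) / 2)

print(density)  # just under 2
print(overlap)  # a small positive overlap
```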

combinatorial optimization – Integer programming for bin covering problem

I encountered an integer programming problem like this:

Suppose a student needs to take exams in $n$ courses {math, physics, literature, etc.}. To pass the exam in course $i$, the student needs to spend an amount of effort $e_i$ on course $i$. The student can graduate if she/he passes 60% of the $n$ courses (courses have different weights). The objective is to allocate her/his effort across courses so that she/he can graduate with the minimal total effort spent.

I think this problem is similar to the bin covering problem when there is only one bin. The formulation is simple. Use $x_i\in\{0,1\}$ to denote whether the student allocates effort to course $i$. Let $w_i$ denote the weight of course $i$ in calculating the final score.

$$\min \sum_i x_i e_i$$

$$\text{s.t.}\quad \sum_i x_i w_i \ge 0.6\,n$$

(or some other predetermined threshold).

My question is, is there a simple heuristic solution for this problem?
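One simple heuristic for this kind of single-constraint covering ("knapsack cover") problem is greedy by effort per unit weight, followed by a pruning pass. A sketch under the assumption that all weights w_i are positive (the function name is mine):

```python
def greedy_graduation(efforts, weights, threshold):
    """Greedy heuristic for:  min sum x_i * e_i
                              s.t. sum x_i * w_i >= threshold, x_i in {0, 1}.
    Pick courses in order of effort per unit weight until the weight
    threshold is met, then drop any course whose weight turned out to be
    unnecessary. Assumes all weights are positive."""
    order = sorted(range(len(efforts)), key=lambda i: efforts[i] / weights[i])
    chosen, total_w = [], 0.0
    for i in order:
        if total_w >= threshold:
            break
        chosen.append(i)
        total_w += weights[i]
    # Pruning pass: discard the most expensive redundant courses first.
    for i in sorted(chosen, key=lambda i: -efforts[i]):
        if total_w - weights[i] >= threshold:
            chosen.remove(i)
            total_w -= weights[i]
    return sorted(chosen)
```

The greedy is not optimal in general; since there is only one covering constraint, an exact answer is also cheap via dynamic programming over the achievable total weight (when the $w_i$ are integers), analogous to 0/1 knapsack.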