functional analysis – Equivalent norms in the intersection

Let $V$ be a vector space. Two norms $\|\cdot\|_1,\|\cdot\|_2: V \longrightarrow \mathbb{R}$ on $V$ are said to be equivalent if there exist $a,b>0$ such that
$$
a\|u\|_1 \leq \|u\|_2 \leq b \|u\|_1, \; \forall \; u \in V.
$$

Now consider two normed spaces $X=(X, \|\cdot\|_X)$ and $Y=(Y, \|\cdot\|_Y)$. It is easy to show that $Z:= X \cap Y$ is a normed space with the norm $\|\cdot\|_Z: Z \longrightarrow \mathbb{R}$ given by
$$
\|u\|_Z=\|u\|_X+\|u\|_Y, \; \forall \; u \in Z.
$$

Question. If, for some $\alpha, \beta>0$, we define ${\rm N}: Z \longrightarrow \mathbb{R}$ by
$$
{\rm N}(u)=\alpha \|u\|_X +\beta \|u\|_Y, \; \forall \; u \in Z,
$$

then does ${\rm N}$ define an equivalent norm (on $Z$) with respect to the norm $\|\cdot\|_Z$?

I don’t see, for instance, how to prove that
$$
\alpha \|u\|_X +\beta \|u\|_Y \leq \gamma( \|u\|_X + \|u\|_Y), \; \forall \; u \in Z \tag{1}
$$

for some $\gamma>0$. Can I use some property of $\max$ or $\min$? Or can I only prove $(1)$ instead of the full equivalence of norms? The equivalence is stronger than $(1)$.
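For what it is worth, here is a minimal sketch of the kind of termwise bound that $\max$ and $\min$ provide:
$$
\min(\alpha,\beta)\,\bigl(\|u\|_X+\|u\|_Y\bigr) \;\le\; \alpha\|u\|_X+\beta\|u\|_Y \;\le\; \max(\alpha,\beta)\,\bigl(\|u\|_X+\|u\|_Y\bigr), \qquad \forall \; u \in Z,
$$
so $(1)$ holds with $\gamma=\max(\alpha,\beta)$, and the lower bound gives the other direction of the comparison.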

Has a quotient of two norms been considered as a constraint in a convex optimization problem before?

I want to solve the optimization problem
$$
\text{minimize } g(x) \quad \text{subject to} \quad \Vert x\Vert_{\infty}/\Vert x\Vert_{2} \le s
$$

for $x\in\mathbb{R}^d$ and $s\in(0,\infty)$.
The function $g$ is (strongly) convex and Lipschitz smooth.

I know that I could probably try to find saddle points of the corresponding Lagrangian, but I would like to know if there is a faster or more elegant way.

Do you know of a similar problem that has been considered before?
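Not an elegant reformulation, but as a numerical baseline, here is a minimal sketch (assuming a hypothetical quadratic choice of $g$; note that the feasible set $\{x: \Vert x\Vert_\infty \le s\Vert x\Vert_2\}$ is a non-convex cone, so a general-purpose solver only returns a stationary point) that hands the ratio constraint to SciPy's SLSQP:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical strongly convex, Lipschitz-smooth objective: g(x) = 0.5 * ||x - b||^2.
d, s = 10, 0.5
rng = np.random.default_rng(0)
b = rng.standard_normal(d)

def g(x):
    return 0.5 * np.sum((x - b) ** 2)

def ratio_slack(x):
    # SLSQP expects inequality constraints written as c(x) >= 0;
    # here c(x) = s - ||x||_inf / ||x||_2 (undefined at x = 0, so start away from 0).
    return s - np.linalg.norm(x, np.inf) / np.linalg.norm(x, 2)

x0 = np.ones(d)  # feasible start: ||x0||_inf / ||x0||_2 = 1/sqrt(d) <= s for d = 10
res = minimize(g, x0, method="SLSQP",
               constraints=[{"type": "ineq", "fun": ratio_slack}])
print(res.x, ratio_slack(res.x))  # a stationary point and its constraint slack
```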

norms – What are the functions such that $ \lVert f + g \rVert_p^p = \lVert f \rVert_p^p + \lVert g \rVert_p^p$?

Let $1 \leq p \leq 2$. I am looking for a characterization of the couples $(f,g)$ of functions $f,g \in L_p(\mathbb{R})$ such that
$$ \lVert f + g \rVert_p^p = \lVert f \rVert_p^p + \lVert g \rVert_p^p.$$

For $p = 2$, this relation is satisfied if and only if $\langle f, g \rangle = 0$. For $p = 1$, it has been shown in this post that the condition is equivalent to $f g \geq 0$ almost everywhere.
For a general $p$, the relation is clearly satisfied as soon as the product $fg=0$ almost everywhere. Is this latter sufficient condition also necessary?

If not, then what if we reinforce the condition with
$$ \lVert \alpha f + \beta g \rVert_p^p = \lVert \alpha f \rVert_p^p + \lVert \beta g \rVert_p^p$$
for any $\alpha, \beta \in \mathbb{R}$?
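Spelling out the sufficiency claim made above: if $fg = 0$ almost everywhere, then at almost every point at least one of $f, g$ vanishes, so
$$ |f+g|^p = |f|^p + |g|^p \quad \text{a.e.}, $$
and integrating over $\mathbb{R}$ gives the identity in question.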

fa.functional analysis – Estimating certain tensor norms on Banach spaces

Let $X$ and $Y$ be Banach spaces. An operator $u:X\to Y$ is called nuclear if $u$ can be written as $u=\sum_{n=1}^\infty x_n^*\otimes y_n$ with $(x_n^*)\subseteq X^*$ and $(y_n)\subseteq Y$ such that $\sum_{n=1}^\infty\|x_n^*\|\|y_n\|<\infty$. Define $N(u):=\inf\{\sum_{n=1}^\infty\|x_n^*\|\|y_n\|\}$, the infimum being taken over all such representations. Denote
$$C(n):=\sup\{N(BA):\|A\|_{\ell_1^n\to\ell_\infty^n}\leq 1,\ \|B\|_{\ell_\infty^n\to\ell_\infty^n}\leq 1\}.$$
Is $\sup\limits_{n\geq 1}C(n)<\infty$?

gn.general topology – Do topologically equivalent norms have the same $C^n(V)$?

If $\lVert\cdot\rVert_1$ and $\lVert\cdot\rVert_2$ are topologically equivalent norms on a vector space $V$, and $f:V \rightarrow \mathbb{R}$, is it true that if $f$ is $C^n$ when differentiated with respect to $\lVert\cdot\rVert_1$, then $f$ is $C^n$ when differentiated with respect to $\lVert\cdot\rVert_2$?

I'm not sure where to begin, but one gets the sense that, since $C^\infty$-ness is defined for manifolds, which only have open sets, the derivative of $f$ might depend only on the open sets and not on the choice of norm.
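A minimal sketch of why at least the first-derivative condition is insensitive to the choice between equivalent norms: topological equivalence gives constants $a,b>0$ with $a\lVert h\rVert_1 \le \lVert h\rVert_2 \le b\lVert h\rVert_1$, and then, for a candidate derivative $L$ at $x$,
$$ |f(x+h)-f(x)-Lh| = o(\lVert h\rVert_1) \iff |f(x+h)-f(x)-Lh| = o(\lVert h\rVert_2), $$
so $f$ is (Fréchet) differentiable at $x$ with the same derivative for either norm; the higher-order conditions can be treated the same way.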

banach spaces – Uniform smoothness and twice-differentiability of norms

To get to the simplest case, consider a norm $\|\cdot\|$ over $\mathbb{R}^n$ that is uniformly smooth of power-type 2, that is, there is a constant $C$ such that $$\frac{\|x+y\| + \|x - y\|}{2} \le 1 + C \|y\|^2$$ for all $x$ with $\|x\| = 1$ and for all $y$.

Question: Does this guarantee that $\|\cdot\|$ has a second-order Taylor expansion on $\mathbb{R}^n \setminus \{0\}$, that is, for every $x \neq 0$ there are a vector $g$ and a symmetric matrix $A$ such that $$\|x + y\| = \|x\| + \langle g, y \rangle + \frac{1}{2} \langle Ay, y \rangle + o(\|y\|^2)?$$ (Apparently this is a weaker requirement than twice-differentiability of $\|\cdot\|$ on $\mathbb{R}^n \setminus \{0\}$.)

It is easy to see that $\|\cdot\|$ is differentiable on $\mathbb{R}^n \setminus \{0\}$, and a classic result of Alexandrov guarantees that the above second-order Taylor expansion holds for any convex function at almost every point $x$. It is also known that the norm of any separable Banach space can be approximated arbitrarily well by a power-type 2 norm that is twice differentiable on $\mathbb{R}^n \setminus \{0\}$ (see Lemma 2.6 here). But I wonder whether the original norm itself has a second-order Taylor expansion.
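As a sanity check only (not the general question), the Euclidean norm on $\mathbb{R}^n$ satisfies both conditions: for $\|x\|_2 = 1$,
$$ \frac{\|x+y\|_2+\|x-y\|_2}{2} \le \sqrt{\frac{\|x+y\|_2^2+\|x-y\|_2^2}{2}} = \sqrt{1+\|y\|_2^2} \le 1+\tfrac{1}{2}\|y\|_2^2, $$
and at every $x \neq 0$ it has the second-order expansion with $g = x/\|x\|_2$ and $A = \bigl(I - xx^\top/\|x\|_2^2\bigr)/\|x\|_2$.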

linear algebra – Frobenius and operator norms of a certain structured matrix


Let $X \in \mathbb R^{n \times d}$, $y \in \mathbb R^{n \times 1}$, and let $u$ be a unit vector in $\mathbb R^{k \times 1}$. Define the function $F:\mathbb R^{d \times k} \to \mathbb R$ by $F(W) := y^\top \psi(XW)u$ for some twice-differentiable function $\psi:\mathbb R \to \mathbb R$, where $\psi(A)$ is the matrix with $ij$-th entry $\psi(a_{ij})$ (i.e., element-wise application of $\psi$).

For any $W \in \mathbb R^{d \times k}$, consider the Hessian $\nabla^2 F(W) \in \mathbb R^{dk \times dk}$.

Suppose $\|\psi''\|_\infty := \sup_{t \in \mathbb R} |\psi''(t)| \le \alpha$, $\|X\|_{op} \le \beta$, and $\|y\| \le \gamma$.

Question. What are good upper bounds for $\sup_W \|\nabla F(W)\|_F$ and $\sup_W \|\nabla^2 F(W)\|_{op}$ in terms of $\alpha$, $\beta$, and $\gamma$?

Define the matrices
$$\eqalign{
Z &= XW \qquad&\implies\quad dZ = X\,dW \\
P &= \psi(Z) \qquad&\implies\quad dP = Q\odot dZ \\
Q &= \psi'(Z) \qquad&\implies\quad dQ = R\odot dZ \\
R &= \psi''(Z) \\
}$$

where $(\odot)$ denotes the elementwise/Hadamard product and $\,(\psi',\psi'')\,$ denote the ordinary first and second derivatives of the $\psi$ function.

In this post, it has been shown that

If $H := \nabla^2 F(W)$ and $(:)$ denotes the trace inner product, then

$$\eqalign{
{\cal H}_{j,j',\ell,\ell'} &= x_\ell^T\big(r_j\odot y\odot x_{\ell'}\big)\;u_j\,\delta_{j,j'} \\
}$$

where the vectors $(x_\ell, r_j)$ are the $\ell^{th}$ and $j^{th}$ columns of the $(X,R)$ matrices, respectively.


A bound for the Frobenius / Hilbert-Schmidt norm:

$$
\begin{split}
\|H\|_F^2 &= \sum_{l=1}^d\sum_{l'=1}^d\sum_{j=1}^k\big((y \odot x_l)^\top(r_j \odot x_{l'})\big)^2u_j^2 \le \sum_{l=1}^d\sum_{l'=1}^d\sum_{j=1}^k u_j^2\|y \odot x_l\|^2\|r_j \odot x_{l'}\|^2\\
&\le \sum_{l=1}^d\sum_{l'=1}^d\|y \odot x_l\|^2\sum_{j=1}^k u_j^2\|r_j \odot x_{l'}\|^2 \le \sum_{l=1}^d\|y \odot x_l\|^2\sup_{1\le j\le k} \sum_{l'=1}^d\|r_j \odot x_{l'}\|^2 = \|X^\top y\|^2\sup_{1\le j\le k}\|X^\top r_j\|^2\\
&\le \|X\|_{op}^4\|y\|^2\sup_{1\le j\le k}\|r_j\|^2 = \beta^4\gamma^2\sup_{1\le j\le k}\|r_j\|^2 \le \alpha^2\beta^4\gamma^2.
\end{split}
$$

Therefore $\|\nabla^2 F(W)\|_F \le \alpha\beta^2\gamma$ for all $W \in \mathbb R^{d \times k}$.
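As a quick numerical sanity check of this bound (a minimal sketch, assuming $\psi = \tanh$; the dimensions and the random instance below are purely illustrative), one can assemble the block-diagonal Hessian from the entry formula above and compare $\|H\|_F$ with $\alpha\beta^2\gamma$:

```python
import numpy as np

# Small random instance; psi = tanh so that |psi''| is bounded.
rng = np.random.default_rng(0)
n, d, k = 30, 6, 4
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)
u = rng.standard_normal(k)
u /= np.linalg.norm(u)                      # unit vector, as in the question
W = rng.standard_normal((d, k))

psi2 = lambda t: -2.0 * np.tanh(t) * (1.0 - np.tanh(t) ** 2)   # psi'' for psi = tanh

# Entry formula: H[(l,j),(l',j')] = delta_{j,j'} * u_j * sum_i y_i * psi''((XW)_{ij}) * X_{il} * X_{il'},
# i.e. H is block diagonal in j with d x d blocks u_j * X^T diag(y * r_j) X.
R = psi2(X @ W)                                                  # n x k matrix R = psi''(XW)
blocks = [u[j] * (X.T @ ((y * R[:, j])[:, None] * X)) for j in range(k)]
fro = np.sqrt(sum(np.linalg.norm(B, "fro") ** 2 for B in blocks))

alpha = np.abs(psi2(np.linspace(-5.0, 5.0, 200001))).max()       # numerical sup |psi''|
beta = np.linalg.norm(X, 2)                                      # operator norm of X
gamma = np.linalg.norm(y)

print(fro, alpha * beta ** 2 * gamma)   # expect the first number to be <= the second
```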

oc.optimization and control – Are all coercive norms obtained from bilinear forms on Hilbert spaces?

Let $H$ be a Hilbert space with inner-product-induced norm $\|\cdot\|_H$. If $\|\cdot\|$ is some other coercive norm on $H$, i.e.:
$$
\lim\limits_{\|x\|_H\to\infty} \|x\|=\infty,
$$

then does there necessarily exist a lower semi-continuous symmetric bilinear form $B:H\times H\rightarrow (0,\infty)$ satisfying
$$
\|x\|=\sqrt{B(x,x)}\,?
$$
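For intuition, a minimal finite-dimensional check (it does not settle the question as posed): on $H=\mathbb{R}^2$ the $\ell_1$ norm is coercive with respect to the Euclidean norm, but it violates the parallelogram law; with $x=(1,0)$ and $y=(0,1)$,
$$ \|x+y\|_1^2+\|x-y\|_1^2 = 8 \neq 4 = 2\bigl(\|x\|_1^2+\|y\|_1^2\bigr), $$
so $\|\cdot\|_1$ cannot be written as $\sqrt{B(x,x)}$ for any symmetric bilinear form $B$.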