Convex optimization – Lagrange function for minimizing one component of a vector with bounded “parameters”

For some days now I have been trying to solve this problem, but I have not found a clean approach, so I could use some help from you guys:

$\min\, y_0$, subject to:
(I): $0 \le y_0, y_1, y_2, \dots, y_k \le 1$
(II): $Y_{lb} \le a \cdot y_0 + b \cdot y_1 + c \cdot y_2 + \dots + d \cdot y_k \le Y_{ub}$

with $a, b, c, \dots, d$ being constants.

So the goal is to find a vector $(y_0, y_1, \dots, y_k)$, where $k$ can be taken as fixed, say with the indices running over $0, \dots, 3$.
The point where I got stuck is that only $y_0$ has to be minimized, while the other components $y_1, \dots, y_k$ are merely bounded.
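For concreteness, here is a minimal numerical sketch of the problem for $k = 3$, using an off-the-shelf LP solver (`scipy.optimize.linprog`) purely as a reference; the coefficients and bounds in it are made-up example values, and this is of course not the Lagrangian/KKT formulation I am actually after:

```python
# Minimal numerical sketch of the problem for k = 3.
# The coefficients a, b, c, d and the bounds Y_lb, Y_ub are made-up examples.
import numpy as np
from scipy.optimize import linprog

coeffs = np.array([0.5, 1.0, 2.0, 1.5])  # hypothetical constants a, b, c, d
Y_lb, Y_ub = 1.0, 3.0                    # hypothetical bounds on the weighted sum

# Objective: minimize y_0 only; y_1..y_3 get weight 0 in the cost vector.
cost = np.array([1.0, 0.0, 0.0, 0.0])

# Two-sided constraint Y_lb <= coeffs @ y <= Y_ub rewritten as A_ub @ y <= b_ub.
A_ub = np.vstack([coeffs, -coeffs])
b_ub = np.array([Y_ub, -Y_lb])

# Box constraints (I): every component of y lies in [0, 1].
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0.0, 1.0)] * 4)
print(res.x, res.fun)
```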

What I would like to have is a Lagrange function that I can then solve via the KKT conditions. But since I don't have an objective function $f(y_0, \dots, y_k)$, I am having trouble setting this up.
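For reference, the kind of Lagrangian I am aiming for (with a generic objective $f$, which is exactly the part I am unsure about; $\lambda_i, \mu_i \ge 0$ are multipliers for the box constraints, $\nu_1, \nu_2 \ge 0$ for the two-sided linear constraint, and I write $a_0 = a, a_1 = b, \dots$ for the constants) would look like:

$$\mathcal{L}(y, \lambda, \mu, \nu) = f(y) - \sum_{i=0}^{k} \lambda_i\, y_i + \sum_{i=0}^{k} \mu_i\,(y_i - 1) + \nu_1 \Big( Y_{lb} - \sum_{i=0}^{k} a_i\, y_i \Big) + \nu_2 \Big( \sum_{i=0}^{k} a_i\, y_i - Y_{ub} \Big)$$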

My idea was to define a surrogate objective that is very sensitive to $y_0$ and much less so to the rest, for example: $f(y) = e^{y_0} + \log(y_1) + \log(y_2) + \log(y_3)$

However, all such approaches end up estimating all of the $y_k$ at their minimum, not only $y_0$ as intended.
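For what it is worth, this is roughly how I tested such a surrogate numerically (same made-up coefficients as in the sketch above, with the lower bound slightly above $0$ so the log terms stay finite); the minimizer ends up being governed by the log terms and does not recover the solution of the original problem:

```python
# Numerical check of the surrogate f(y) = e^{y_0} + log(y_1) + log(y_2) + log(y_3),
# subject to the same (made-up) constraints as in the LP sketch above.
import numpy as np
from scipy.optimize import minimize

coeffs = np.array([0.5, 1.0, 2.0, 1.5])  # hypothetical constants a, b, c, d
Y_lb, Y_ub = 1.0, 3.0                    # hypothetical bounds

def surrogate(y):
    return np.exp(y[0]) + np.sum(np.log(y[1:]))

constraints = [
    {"type": "ineq", "fun": lambda y: coeffs @ y - Y_lb},  # coeffs @ y >= Y_lb
    {"type": "ineq", "fun": lambda y: Y_ub - coeffs @ y},  # coeffs @ y <= Y_ub
]

# Lower bounds slightly above 0 so log() stays finite at the boundary.
res = minimize(surrogate, x0=np.full(4, 0.5),
               bounds=[(1e-6, 1.0)] * 4,
               constraints=constraints, method="SLSQP")
print(res.x)  # compare with the LP result above: the surrogate does not recover min y_0
```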
I would be very happy about any ideas.
Thanks a lot in advance!