nt.number theory – Proof that $(ca_1)^x = a^{2x} + (ab)^x$ has no solutions when only $a$ and $b$ share a common factor

How do I prove that if $a$, $b$, and $c$ are natural numbers, then the equation $(ca_1)^x = a^{2x} + (ab)^x$, where $a_1 = a/\gcd(a,b)$, has no solutions when $\gcd(a,b) > 1$, $\gcd(b,c) = 1$, and $\gcd(a,c) = 1$?
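For what it's worth, I ran a small brute-force search (a minimal sanity check with an arbitrary bound, not a proof; here `a1` is computed as `a // gcd(a, b)` as described below), and it found no solutions in that range:

```python
from math import gcd

# Brute-force sanity check of the claim over small natural numbers.
# Assumption: a1 denotes a / gcd(a, b); LIMIT is an arbitrary cutoff.
LIMIT = 30

for a in range(1, LIMIT):
    for b in range(1, LIMIT):
        g = gcd(a, b)
        if g <= 1:
            continue  # only the case gcd(a, b) > 1 is of interest
        a1 = a // g
        for c in range(1, LIMIT):
            if gcd(b, c) != 1 or gcd(a, c) != 1:
                continue  # require gcd(b, c) = gcd(a, c) = 1
            for x in range(1, 6):
                if (c * a1) ** x == a ** (2 * x) + (a * b) ** x:
                    print("counterexample:", a, b, c, x)
```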
Also, when $\gcd(a,b) > 1$, we can write $a = ga_1$ and $b = gb_1$, where $g = \gcd(a,b)$, so that $a/b = a_1/b_1$ in lowest terms.
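To make this concrete (a sketch under my assumption that $a_1 = a/g$): substituting $a = ga_1$ and cancelling $a_1^x$ gives

$$(ca_1)^x = a^{2x} + (ab)^x \;\Longrightarrow\; c^x a_1^x = a^x\left(a^x + b^x\right) \;\Longrightarrow\; c^x = g^x\left(a^x + b^x\right),$$

so it seems every prime dividing $g$ must divide $c$, which would contradict $\gcd(a,c) = 1$ when $g > 1$. Is this the intended line of argument, or is there a gap I'm missing?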
Thanks for your attention.