The arguments to cg() are pointers to integers. Now let's look at the first line of cg(): (*a)*=3;.
(*a) dereferences a, so this expression is the integer that a points to. *= multiplies the left side by the right side, so (*a) *= 3; is the same as (*a) = (*a) * 3;.
Finally, going back up to main(), where cg is called, we have cg(&x, &y, &z);. Here & takes the address of its argument: x is an integer, and &x is the address of x.
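Putting the two operators together, here's a minimal sketch (the variable names are just for illustration) showing that & produces an address, * follows it back, and a compound assignment through the pointer changes the original variable:

#include <iostream>
using namespace std;

int main()
{
    int x = 7;
    int * p = &x; // p holds the address of x
    (*p) *= 3;    // same as x = x * 3
    cout << x;    // prints 21, because *p and x are the same integer
    return 0;
}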
Whoever gave you this code should be tossed into a woodchipper. Notice how the parameter names change between the declaration and the definition. Only the names in the definition matter, and there they are listed arbitrarily in reverse of both their alphabetical order and the order of the variables passed in from main(). It looks like this code was written to intentionally trip you up. Here's some functionally equivalent code:
#include <iostream>
using namespace std;

// forward declaration: no parameter names needed, only the types matter
void cg(int *, int *, int *);

int main()
{
    int x = 3;
    int y = 4;
    int z = (5 - x); // 2
    cg(&x, &y, &z);
    cout << x << " " << y << " " << z; // prints 3 8 6
    return 0;
}

void cg(int * a, int * b, int * c)
{
    (*a) *= 1;
    (*b) *= 2;
    (*c) *= 3;
}
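For contrast, the same effect can be had with references instead of pointers, which spares you the & at the call site and the * inside the function. This is just a sketch; cg_ref is a name I made up for the reference version:

#include <iostream>
using namespace std;

// hypothetical reference version of cg: same effect, no pointer syntax
void cg_ref(int & a, int & b, int & c)
{
    a *= 1;
    b *= 2;
    c *= 3;
}

int main()
{
    int x = 3;
    int y = 4;
    int z = (5 - x); // 2
    cg_ref(x, y, z); // no &: references bind to the arguments automatically
    cout << x << " " << y << " " << z; // prints 3 8 6
    return 0;
}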