Weird problem with custom vector class

So, I recently started doing some OpenGL stuff and have hit a problem I just can't figure out. I've created a class called "Scene" which, among other things, holds the background colour of the scene being rendered.

The colour is stored as a "Vec3f" object, a class that acts as a vector of three floats, which I got from this site: http://www.videotutorialsrock.com/
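To give an idea of the interface, the parts of Vec3f that my code relies on look roughly like this (a rough sketch from memory, not the actual header from that site):

#include <ostream>

class Vec3f {

    private:
    float v[3];
    
    public:
    Vec3f ();
    Vec3f (float x, float y, float z);
    float &operator[] (int index); //lets me write bgColour[0] etc.
};

std::ostream &operator<< (std::ostream &output, const Vec3f &v); //prints "(x, y, z)"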

However, when I try to draw the scene background in the right colour, the values I get back from the class are wrong.

Here is the header for my "Scene" class:
#include "vec3f.h"

#ifndef SCENE_H
#define SCENE_H

class Scene {

    private:
    Vec3f bgColour;
    
    public:
    Scene ();
    void set_bgColour (float, float, float);
    Vec3f get_bgColour ();
};

#endif 


And here is the .cpp file:

#include <iostream>
#include "scene.h"
#include "vec3f.h"
using namespace std;

Scene::Scene () {
    Vec3f bgColour (0.5f, 0.5f, 0.5f);
    cout << bgColour; //This outputs "(0.5, 0.5, 0.5)", as you would expect
}

void Scene::set_bgColour(float newRed, float newGreen, float newBlue) {
    Vec3f bgColour (newRed, newGreen, newBlue);
    cout << bgColour; //This correctly outputs whatever values I set the colour as
}

Vec3f Scene::get_bgColour() {

    cout << bgColour; //But here is the problem! This always outputs something like "(0, -4.65661e-10, 0)", where the first and last floats are always 0 and the middle one is seemingly random.

    return bgColour;
}


And here is my "main.cpp":
#include <iostream>
#include <stdlib.h>
#include <math.h>
#include <string.h>

#ifdef __APPLE__
#include <OpenGL/OpenGL.h>
#include <GLUT/glut.h>
#else
#include <GL/glut.h>
#endif

#include "image.h"
#include "scene.h"
#include "vec3f.h"

using namespace std;

/* .... */

Scene* currentScene = new Scene ();

/* .... */

//initialise OpenGL
void init () {
    
    /* .... */
    
    currentScene->set_bgColour(0.7f, 0.3f, 0.3f); //This outputs "(0.7, 0.3, 0.3)"
}

/* .... */

void render_scene () {

    Vec3f bgColour = currentScene->get_bgColour(); //But this always outputs "(0, [random numbers], 0)" 
    
    glClearColor(bgColour[0], 
                 bgColour[1], 
                 bgColour[2], 1);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    
    glutSwapBuffers();
}

int main (int argc, char** argv) {
    
    /* .... */ 

    init();
    
    glutDisplayFunc(render_scene);
    
    /* .... */
    
    glutMainLoop();
    return 0;
}


I could also post the "Vec3f" files if needed, although they are a bit more lengthy. Thanks for reading, I'd really appreciate help on this problem!

This code:

void Scene::set_bgColour(float newRed, float newGreen, float newBlue) {
    Vec3f bgColour (newRed, newGreen, newBlue);
    cout << bgColour;
}


Your code creates a new local Vec3f called bgColour that shadows the member variable of the same name, outputs that local object, and then destroys it when the function ends. You never actually change the member variable.
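One fix is to assign to the member instead of declaring a new local (a minimal sketch; it assumes Vec3f is copy-assignable, which holds for a plain class of three floats with no custom assignment operator):

void Scene::set_bgColour(float newRed, float newGreen, float newBlue) {
    bgColour = Vec3f(newRed, newGreen, newBlue); //assigns to the member, no shadowing
    cout << bgColour;
}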

Much the same in your Scene::Scene constructor.
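For the constructor, a member initializer list is the usual fix (a sketch, assuming Vec3f has the three-float constructor your code already calls):

Scene::Scene () : bgColour(0.5f, 0.5f, 0.5f) { //initialises the member directly
    cout << bgColour;
}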
Yes! I changed it to
void Scene::set_bgColour(float newRed, float newGreen, float newBlue) {
    bgColour [0] = newRed;
    bgColour [1] = newGreen;
    bgColour [2] = newBlue;
    cout << bgColour;
}

and it works fine. I knew it would be something facepalmey.

Thanks for your help!