Using extended ASCII in C++.

Hi,

So I was curious if you could do something like this:

#define ∑ 1

Now, I mean something like this for the following reason: when I try compiling that code, it says the following:

error: macro names must be identifiers

Now granted, ∑ means "sum of" and has nothing to do with 1, but, baby steps. I first need to see if it is possible to use ∑ in the first place. So basically, what I'm getting at is: can you use extended ASCII in C++ code as is?

I know you'll probably say to make a sumOf method or function, but I want to know if there is a way to use it exactly as ∑!
C identifiers match the following regular expression: [_A-Za-z][0-9_A-Za-z]*
In other words, they may only contain alphanumeric characters or underscores, they must be at least one character long, and the first character cannot be a number. '∑' fails the first condition.

Furthermore, source files that use characters beyond ASCII are unportable. Some compilers accept them (for example, in strings); most don't.
Technically, what I just said is inaccurate, but who cares about EBCDIC?
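
If you want to check a candidate name against that rule yourself, something like this would do it. This is only a minimal sketch, and the test names are just examples:

#include <iostream>
#include <regex>
#include <string>

int main() {
    // The same identifier rule as the regular expression quoted above.
    const std::regex identifier("[_A-Za-z][0-9_A-Za-z]*");
    for (const std::string name : {"SIGMA", "sum_of", "2sum", "\u2211"}) {
        std::cout << name
                  << (std::regex_match(name, identifier) ? " is" : " is not")
                  << " a valid identifier\n";
    }
}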
Is there any workaround for this, then?
No. What you want to do is simply not allowed by the language.
Then how do other languages such as APL do it?
I assume they allow it.
So if I wanted to do something like this I would need to make a compiler or interpreter that would accept those characters?
I suppose, but then why not just use one of those languages that do allow it, instead of going through all that trouble?
Well, I was thinking about using #define, and if that had worked it wouldn't have been so bad (I am trying to find another project to embark on that will last the summer). Second, I have less than $60 in total, so I can't afford APL or languages like that. But, since interpreters and compilers are too complicated for me right now (even if I spent the next couple of years learning them), I guess I'll find another project to go for.
I honestly don't understand why that one character is so critical.

I'm sure there are free implementations of the language.
Maybe, but this was mostly for the project aspect of it.
Why not just:
 
#define SIGMA 1 
?

Although I can't fathom why you would want to do this.
This:
But, since interpreters and compilers are too complicated for me right now

And the fact that you chose the pseudonym 'Codeboy' amuses me.
What you can do is write a preprocessor program that converts the Unicode characters into alphanumeric sequences and then calls the real compiler with the modified source code.
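
A rough sketch of that idea follows; the file names, the symbol it replaces, and the g++ command line are all just assumptions for illustration:

#include <cstdlib>
#include <fstream>
#include <sstream>
#include <string>

int main() {
    // Read the "extended" source, which may contain ∑.
    std::ifstream in("input.upp");
    std::stringstream buffer;
    buffer << in.rdbuf();
    std::string source = buffer.str();

    // Replace every occurrence of the UTF-8 encoding of ∑ with SIGMA.
    const std::string from = "\u2211";
    const std::string to = "SIGMA";
    for (std::size_t pos = source.find(from); pos != std::string::npos;
         pos = source.find(from, pos + to.size())) {
        source.replace(pos, from.size(), to);
    }

    // Write the translated source and hand it to the real compiler.
    std::ofstream out("translated.cpp");
    out << source;
    out.close();
    return std::system("g++ translated.cpp -o program");
}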
I still don't understand what the goal of defining sigma to an integer value might be.

I would think defining it as SIGMA the same way you would define PI would be adequate.
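
For example, something along these lines; the values are only placeholders, mirroring the original post:

#define PI 3.14159265358979323846
#define SIGMA 1

Or, as constants rather than macros:

constexpr double PI = 3.14159265358979323846;
constexpr int SIGMA = 1;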