Is it reasonable to abandon raw pointers in favor of smart pointers in modern C++ for all programs?

Hi people,

I think that, with the good features smart pointers deliver to us, it's time to abandon raw pointer usage in all programs! Why not?

Needless to say, there are a precious few situations, like embedded systems programming, where we're required to stick with the old tools because of the poor/limited hardware of the target device.

Apart from scenarios like that, I don't see any rationale for hanging on to raw pointers, especially when we're approaching C++20!
You're babbling to the choir, dude.
The main problem with adoption of smart pointers, to name just one aspect of modern C++, is the inertia of "it's good enough and it works." Or "this is what I know, I don't have time to learn more."

Programming students are still being taught a lot of outdated, if not outright wrong, things -- for example, generating random numbers with the C library facilities instead of the C++ <random> library.
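
For instance, the C++ way looks something like this (a minimal sketch):

#include <iostream>
#include <random>

int main()
{
   // a seeded engine plus a distribution, instead of the old rand() % 6 idiom
   std::mt19937 engine { std::random_device{}() };
   std::uniform_int_distribution<int> die { 1, 6 };

   for (int i = 0; i < 5; ++i)
      std::cout << die(engine) << ' ';
   std::cout << '\n';
}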

Learn-to-program books are, for the most part, outdated from the minute they are published, and a lot of them are not that great even when they do try to teach the basics.

I'm constantly seeing new uses of what modern C++ has to offer here, over what I thought I knew was "The One True Way."

BTW, this is a topic that should probably be in the Lounge since any who respond will be talking opinion.

Some will insist they are speaking fact, but they aren't.
Yep. There's also a lot of outdated/wrong info on the internet. A problem is that beginners can't tell the good from the bad from the ugly (apologies to the film :) ).

Pointers remain useful in cases where the component holding the pointer does not own the pointed-to object.

A unique_ptr claims sole ownership;
A shared_ptr claims part ownership; and
A "raw" pointer claims no ownership.

Raw pointers round out C++'s ownership model.
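
In code, the three claims might look something like this (an illustrative sketch):

#include <iostream>
#include <memory>

struct Widget { int value = 42; };

// a raw pointer parameter claims no ownership: it only observes
void print(const Widget* w)
{
   if (w) std::cout << w->value << '\n';
}

int main()
{
   auto owner = std::make_unique<Widget>();    // sole ownership
   auto shared = std::make_shared<Widget>();   // part ownership (shared with any copies)

   print(owner.get());    // hand out non-owning views
   print(shared.get());
}
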
> Is it reasonable to abandon raw pointers in favor of smart pointers in modern C++ for all programs?

No.

Give in to these silly demands and the people who long for the abolishment of raw pointers would next want to do away with iterators. After all, the usual iterator is in effect a (typically user-defined) raw pointer with no ownership semantics whatsoever.
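
(A raw pointer already meets the iterator requirements; for example, pointers into a plain array can be handed straight to a standard algorithm:)

#include <algorithm>
#include <iostream>

int main()
{
   int a[] = { 3, 1, 2 };

   std::sort(a, a + 3);   // pointers as random-access iterators: no ownership, just position

   for (int x : a) std::cout << x << ' ';
   std::cout << '\n';
}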

...a lot of outdated/wrong info on the internet.

Oh, my, absolutely. Even for non-noobs who don't live and breathe C/C++ programming (*raises hand*), it is hard to get solid information on new features of the evolving C++ standard from time to time.

Raw pointers have their uses, so do smart pointers.

For a hobby (and for learning) I take some older code and muck around with it, trying to see if I can use a newer standard C++ feature than the code was written for. Most times I'm successful, sometimes not so much.

From each bit of code I learned something, which is all good.
raw pointers are also nice when you don't have dynamic memory.
int foo;
int * fp = &foo;
^^what smarts does it NEED? It's smart enough in the raw, and smart pointers DO have overhead.

Nerfing the language to keep beginners out of trouble is how we got new languages that lack features. We have those, in spades. Leave C++ core features alone :)

Add to it, smart pointers are... guess what? A class wrapper for raw pointers. If you take raw pointers out, the only way to build this tool is with assembly language, now? Or a wrapper around C code?
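
Stripped to the bare bones, the idea is just RAII around a raw pointer; a toy sketch (nowhere near a real unique_ptr -- no deleters, no move assignment, etc.):

#include <utility>

template <typename T>
class toy_owner {
   T* p_ = nullptr;   // the raw pointer being wrapped
public:
   explicit toy_owner(T* p) : p_{p} {}
   ~toy_owner() { delete p_; }   // RAII: the destructor releases the resource

   toy_owner(const toy_owner&) = delete;              // sole ownership: no copies
   toy_owner& operator=(const toy_owner&) = delete;

   toy_owner(toy_owner&& other) noexcept : p_{ std::exchange(other.p_, nullptr) } {}

   T& operator*() const { return *p_; }
   T* get() const { return p_; }
};

int main()
{
   toy_owner<int> p { new int{5} };
   *p = 7;   // used like a pointer; the delete happens automatically
}
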
I've actually rarely had problems using raw pointers (at least compared to other students). I assume it's because I learned from the learncpp.com tutorial. The class that "taught" us about pointers was very vague, and most students just wanted to die. They did get their wish granted, though, since in the class right after that the professor decided pointers were the best thing to ever exist and would write the most atrocious code.

I only use smart pointers when I think the program is going to become complicated. Smart pointers, the random library, lists, maps, debugging techniques, etc. -- none of those were ever taught. Even vectors were never taught, but they were so popular that eventually some professors allowed their use.


Quick story:

I was coding an assignment for a lab in my very first CS class. We had access to the computers inside the room, but I preferred my laptop, which was faster and had VS on it. Eventually, I realized they wanted us to use a VLA, which VS doesn't support. I figured I could sneak a vector in there instead, which he noticed! But he let me off the hook after I explained. ;o
In my workplace, we've agreed not only to dump raw pointers in all new code except when constrained by legacy (smart pointers for memory management, references for just meaning some other variable); we have also committed to no more loops with manual index values (unless you truly need to know the absolute position of the current container element, and even then there's probably a better way). Range-based loops and std algorithms all the way.
That seems excessive. When the loop counter is part of the computation, the original for loop seems better to me. If the loop counter serves no purpose other than inside a [i], or isn't used at all, the range-based one seems better. And I am not even sure if a comma / multi-operation for loop (e.g. a for(int i = 3, j = 4; i*j < k; j++, i += 5) type thing) is supported, or a for(int i = 0; i < max; cout << i++ << endl); type thing. I haven't tried to get too snarky with the range-based ones... and maybe that kind of stuff isn't useful outside of codechef anyway :P

It's probably a LOT prettier your way. I can see and admit that raw * isn't required to write code most of the time. The places I keep using it are external interfaces... one tool I use wants char* strings sent to it, and it isn't going to take anything else this century.
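
For that kind of boundary, one approach is to keep the ownership in a std::string and only hand the raw char pointer across at the call site (std::puts standing in here for the external tool):

#include <cstdio>
#include <string>

int main()
{
   std::string msg = "hello";   // the string owns the characters
   std::puts(msg.c_str());      // the C-style interface gets a non-owning const char*
}
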
@Repeater,

I'm sure you already know, C++20 adds a new variation to the range-based for loop. It allows you to initialize a variable within the loop definition itself.

for (init-statement; element_declaration : array) statement;

While that doesn't allow for direct indexing of the array, it could be used for a counter type computation.
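
For example (C++20):

#include <cstddef>
#include <iostream>
#include <vector>

int main()
{
   std::vector<int> v { 10, 20, 30 };

   // the init-statement declares a counter scoped to the loop
   for (std::size_t i = 0; int x : v)
      std::cout << i++ << ": " << x << '\n';
}
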
Range-based loops are deliberately designed as an extremely simple construct: they are appropriate when all that we want to do is access every element in the sequence. This certainly is the most common usage scenario: our aim is to go through every item in the sequence, one after the other.

The classical for loop offers far greater control over iteration. For instance, where we are interested in making structural changes to the container or we want to access two sequences in parallel, we would use a classical for loop.
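
For instance, walking two sequences in parallel (an illustrative sketch) is a natural fit for the classical loop; a range-based loop follows only one sequence:

#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

int main()
{
   const std::vector<std::string> names { "ann", "bob" };
   const std::vector<int> ages { 30, 40 };

   for (std::size_t i = 0; i < names.size() && i < ages.size(); ++i)
      std::cout << names[i] << " is " << ages[i] << '\n';
}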

One size does not fit all; why write code that deliberately refuses to use one kind of for loop at all, even in situations where it would have led to cleaner, more understandable code?
"One size does not fit all; why write code that deliberately refuses to use one kind of for loop at all, even in situations where it would have lead to cleaner, more understandable code? "

Why indeed. We've allowed the explicit index where it's necessary, and forbidden it elsewhere. As you suggest, the aim is cleaner, more understandable code.

Writing a loop with an explicit index says "I am choosing to create this index because for some reason I need it", because otherwise I would be using a std algorithm or a range-based loop or something else that's cleaner and more understandable. Those std algorithms make code SO understandable. If someone says to me "I am choosing to create this index because for some reason I need it" and then they DON'T use it, I now have to wonder why they lied to me: did they mean to use it and forget? Did the code change so that it no longer makes sense? Did they somehow not know about the better ways to write this (and thus I need to be on my guard, because this programmer doesn't know some fairly basic things)?

Everyone is but a product of their experience. I work on software that's had over a decade of development with a rotating cast of fifty or so developers, about half of whom had zero professional experience and inadequate review and supervision; ninety percent of the crashes are caused by using raw pointers to bad objects, and over and over, if someone had just used a smart pointer, it would have been safe and clean and clear. Over and over, there are index-based loops that obscure what they're doing and manually handle (and sometimes fumble) their indices (sometimes more than one index at a time). Over and over, replacing it with a range-based loop or something from std algorithms created safer, faster, cleaner, more expressive code.
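
For instance (an illustrative before/after, not code lifted from our codebase):

#include <algorithm>
#include <cstddef>
#include <iostream>
#include <vector>

int main()
{
   const std::vector<int> readings { 3, -1, 4, -5, 9 };

   // manual index loop
   int negatives = 0;
   for (std::size_t i = 0; i < readings.size(); ++i)
      if (readings[i] < 0) ++negatives;

   // the std algorithm says what it does in one call
   const auto negatives2 = std::count_if(readings.begin(), readings.end(),
                                         [](int r) { return r < 0; });

   std::cout << negatives << ' ' << negatives2 << '\n';
}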

So we've switched to "safe, clean, expressive, efficient" by default, which means "smart pointers for memory handling by default, no manual looping by default", and it's made the code safer, faster, cleaner, more expressive.

It has come with an additional benefit: people who previously stayed in their comfort zone, did all their own manual handling, and never used std algorithms and so on now do. They start simple and small, but soon enough it's fluid and natural, and then they start thinking and designing in std algorithms and friends, combining them, and writing in two or three safe, clean lines what used to take them twenty or thirty awkward lines, and their ability to deliver good code simply goes up a couple of levels. In our experience, by removing options to write worse code, people learn to write better code.
Now, that I can agree to; it sounds like a good place to work ^^
And yeah, CS grads are woefully unprepped these days. I felt like I could do anything that needed doing when I graduated. Now, I don't think I could do 20% of what needs doing with the same background. The field has just grown randomly in every direction.