Merging many containers with STL algorithms

I have a lot of lists, vectors, sets ... (whatever you prefer) of pointers, called RealAlgebraicNumberPtr, to a certain class. They are sorted.

I want to merge them, and of course I want to do it fast and efficiently.

What is the best option? std::merge? Or maybe std::set? I can provide both an < and an == ordering.

Any ideas?

asked Nov 8 '11 at 11:11

std::merge is for merging containers. How would you use std::set? -

I don't. It's meant as an xor ;) -

@VJo: you could insert the contents of all your containers into a set, ending up with an ordered, merged collection of all of them without duplicates (or with duplicates if you use multiset instead). However, since the containers are sorted to begin with, std::merge will be more efficient. -

2 Answers

As mentioned, std::merge is fine.
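Since the elements are pointers, std::merge needs a comparator that orders by the pointed-to values rather than by pointer address. A minimal sketch, assuming a hypothetical RealAlgebraicNumber class with a value member and RealAlgebraicNumberPtr as a shared_ptr alias (both names invented here for illustration; the real class will differ):

    #include <algorithm>
    #include <iterator>
    #include <memory>
    #include <vector>

    // Hypothetical element type; stands in for the asker's real class.
    struct RealAlgebraicNumber { double value; };
    using RealAlgebraicNumberPtr = std::shared_ptr<RealAlgebraicNumber>;

    // Order by the pointed-to value, not by pointer address.
    bool lessByValue(const RealAlgebraicNumberPtr& a, const RealAlgebraicNumberPtr& b)
    {
        return a->value < b->value;
    }

    int main()
    {
        std::vector<RealAlgebraicNumberPtr> xs, ys; // assumed already sorted with lessByValue
        std::vector<RealAlgebraicNumberPtr> merged;
        merged.reserve(xs.size() + ys.size());

        std::merge(xs.begin(), xs.end(), ys.begin(), ys.end(),
                   std::back_inserter(merged), lessByValue);
    }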

For std::list specifically, you can profit from the optimization that the std::list::merge member function implements: it splices the list nodes from the source into the target. The source list ends up empty, but resource (re)allocation is avoided entirely.
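A minimal sketch of that splicing behaviour, using int as a placeholder for the pointer element type:

    #include <cassert>
    #include <list>

    int main()
    {
        std::list<int> target {1, 3, 5};
        std::list<int> source {2, 4, 6};

        // std::list::merge splices nodes from source into target;
        // no elements are copied and no memory is reallocated.
        target.merge(source); // uses operator< by default

        assert(source.empty());     // source has been emptied
        assert(target.size() == 6); // target now holds all elements, sorted
    }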

Re: std::set

You could in fact std::merge into a std::set to get unique values in one go. With the generic merge, duplicate values are not filtered, but the result is sorted, so you could apply std::unique to it afterwards. If you expect a lot of duplicates, you might be quicker using a std::set.
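Both routes, sketched with int standing in for RealAlgebraicNumberPtr:

    #include <algorithm>
    #include <iterator>
    #include <set>
    #include <vector>

    int main()
    {
        std::vector<int> a {1, 2, 4, 7};
        std::vector<int> b {2, 3, 4, 8};

        // Route 1: merge into a vector (duplicates kept, output sorted),
        // then drop adjacent duplicates with std::unique + erase.
        std::vector<int> merged;
        std::merge(a.begin(), a.end(), b.begin(), b.end(),
                   std::back_inserter(merged));
        merged.erase(std::unique(merged.begin(), merged.end()), merged.end());

        // Route 2: merge straight into a std::set, which filters
        // duplicates on insertion.
        std::set<int> unique_set;
        std::merge(a.begin(), a.end(), b.begin(), b.end(),
                   std::inserter(unique_set, unique_set.end()));
    }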

answered Nov 8 '11 at 11:15

std::merge is as efficient as it gets. Which underlying container you use depends on your requirements. std::vector has the smallest memory overhead of all standard containers, so if your data are large, you should stick with that.

If you use std::vector, you have to resize the target vector before merging to avoid reallocations (you should be able to calculate the required size up front), instead of using a std::back_inserter.
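A minimal sketch of the resize-before-merge approach, again with int as a placeholder element type:

    #include <algorithm>
    #include <vector>

    int main()
    {
        std::vector<int> a {1, 3, 5};
        std::vector<int> b {2, 4, 6};

        // The required size is known up front, so resize once and
        // write through a plain iterator; no reallocation during the merge.
        std::vector<int> result;
        result.resize(a.size() + b.size());
        std::merge(a.begin(), a.end(), b.begin(), b.end(), result.begin());
    }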

answered Nov 8 '11 at 11:16

"std::vector has the smallest memory-overhead of all standard containers" - although because of the contiguity requirement, and the fact that vector copies as it reallocates, it does not follow from this that std::vector is the container that can hold the most elements when you're using a large proportion of available memory. That depends whether (virtual) address space is fragmented and how the vector's capacity grows, deque could win. - Steve Jessop

@BjörnPollex Instead of recommending resize, I suggest reserve in combination with std::back_inserter: best of both worlds: less error-prone (e.g. an off-by-n reserved size) and a negligible performance difference - sehe
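For contrast with the resize version above, the reserve-plus-back_inserter variant from this comment, sketched under the same placeholder assumptions:

    #include <algorithm>
    #include <iterator>
    #include <vector>

    int main()
    {
        std::vector<int> a {1, 3, 5};
        std::vector<int> b {2, 4, 6};

        std::vector<int> result;
        // reserve avoids reallocation while back_inserter grows the size
        // as elements arrive, so an over-estimated capacity cannot leave
        // stale default-constructed elements behind.
        result.reserve(a.size() + b.size());
        std::merge(a.begin(), a.end(), b.begin(), b.end(),
                   std::back_inserter(result));
    }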
