I have a Receiver object that sometimes accumulates a queue of packets, which are consumed as they are processed. It seems reasonable to give this receiver the iterator protocol, so that next( receiver ) manually retrieves the next packet, and for packet in receiver iterates through the currently available packets. Intuitively, it would be fine to do such an iteration once, going through all the available packets until the receiver stops the for loop by raising StopIteration (the standard way for an iterator to tell a for loop it's time to stop), and then later use another for loop to go through whatever new packets have arrived in the interim.
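To make the setup concrete, here is a minimal sketch of such a receiver, under the assumption that the real object wraps a C library; the `_feed` method below is a hypothetical stand-in for packets arriving from outside.

```python
from collections import deque


class Receiver:
    """Minimal sketch of a stop-and-go receiver (hypothetical; the real
    packet source is a C library, represented here by _feed)."""

    def __init__(self):
        self._queue = deque()

    def _feed(self, packet):
        # Stand-in for the C library delivering a packet.
        self._queue.append(packet)

    def __iter__(self):
        # The receiver is its own iterator.
        return self

    def __next__(self):
        if not self._queue:
            raise StopIteration  # queue drained; a for loop stops here
        return self._queue.popleft()
```

With this, next( receiver ) pops one packet, a for loop drains whatever is currently queued, and a later for loop picks up packets that arrived in the interim, which is exactly the resumption the docs frown on.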
However, Python docs say:
Once an iterator’s __next__() method raises StopIteration, it must continue to do so on subsequent calls. Implementations that do not obey this property are deemed broken.
Even though this code is supposedly "deemed broken", it works just fine as far as I can tell. So I'm wondering: how bad is it to have code that seemingly works fine, and in the way one would intuitively expect an iterator to be able to work, but is somehow "deemed broken"? Is there something actually broken about returning more items after you've raised StopIteration? Is there some reason I should change this?
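One concrete failure mode, sketched with hypothetical names: generic consumers are entitled to treat the first StopIteration as permanent. The round-robin merge below (in the spirit of the itertools "roundrobin" recipe) drops an iterator the first time it raises, so a packet that arrives later is silently lost.

```python
class Revivable:
    """Hypothetical stand-in for a stop-and-go receiver."""

    def __init__(self, items):
        self.items = list(items)

    def __iter__(self):
        return self

    def __next__(self):
        if not self.items:
            raise StopIteration
        return self.items.pop(0)


def merge_streams(iterators):
    # Round-robin merge: an iterator is discarded permanently the first
    # time it raises StopIteration, exactly as the protocol entitles a
    # consumer to do.
    active = list(iterators)
    while active:
        for it in list(active):
            try:
                yield next(it)
            except StopIteration:
                active.remove(it)  # assumed exhausted forever


r1, r2 = Revivable([1, 2]), Revivable([])
merged = merge_streams([r1, r2])
print(next(merged))  # 1
print(next(merged))  # 2  (r2 was quietly dropped in between)
r2.items.append("late packet")
print(list(merged))  # []  -- "late packet" is never seen
```

Nothing crashes; the revived iterator's items just vanish from the merged stream, which is the kind of silent wrong answer the docs' "deemed broken" wording is guarding against.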
(I recognize that I could make the receiver a mere iterable (whose __iter__ method would produce some other iterator) rather than an iterator itself (with its own __next__ method), but:

(a) this wouldn't support the familiar, intuitive use of next( receiver ) to pop the next packet off the queue;

(b) it seems wasteful and inefficient to repeatedly spawn new iterator objects when I already have a perfectly fine iterator-like object whose only fault is that it is apparently "deemed broken"; and

(c) it would be misleading to present the receiver as a sort of iterable container, since the receiver consumes packets as it retrieves them (behavior built into the C library that I'm writing a Python wrapper for, and I don't think it makes sense to start caching them in Python too). If somebody made multiple iterators to traverse the receiver's queue at their own pace, the iterators would steal items from each other and yield far more confusing results than anything I can see arising from presenting this as a single stop-and-go iterator rather than as an iterable container.)
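Point (c) is easy to demonstrate. Here is a hypothetical sketch of the rejected design, where __iter__ spawns fresh iterators over the one shared, consuming queue:

```python
from collections import deque


class IterableReceiver:
    """Hypothetical sketch of the rejected design: a mere iterable whose
    __iter__ spawns a fresh iterator each time."""

    def __init__(self, packets):
        self._queue = deque(packets)

    def __iter__(self):
        # Every call builds a brand-new iterator object...
        return self._drain()

    def _drain(self):
        # ...but they all pop from the same underlying queue.
        while self._queue:
            yield self._queue.popleft()


receiver = IterableReceiver([1, 2, 3, 4])
a, b = iter(receiver), iter(receiver)
print(next(a))  # 1
print(next(b))  # 2 -- b "stole" the packet a would otherwise have seen
print(next(a))  # 3
```

Two iterators over the same receiver interleave unpredictably, which is arguably more surprising than a single iterator that resumes.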
Another (now-deleted) answer pointed out that Python's built-in file objects produce an iterator that can be, and often is, restarted after stopping, which is some evidence that stop-and-go iterators can be perfectly functional, just not the way iterators are officially supposed to work.
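That file-object behavior can be seen directly. The following sketch (file names are illustrative) iterates a file to exhaustion, appends to it through a second handle, and then iterates the same object again; on CPython, at least on typical POSIX filesystems, the iterator revives and yields the new line.

```python
import os
import tempfile

# Create a throwaway demo file.
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "w") as w:
    w.write("one\ntwo\n")

reader = open(path)
print([line.rstrip() for line in reader])  # ['one', 'two']
# That for loop ended because reader.__next__ raised StopIteration.

with open(path, "a") as w:
    w.write("three\n")

# The very same iterator object yields again once new data exists:
print([line.rstrip() for line in reader])  # ['three']

reader.close()
os.remove(path)
```

This is the same pattern that simple tail -f implementations rely on, so "deemed broken" clearly doesn't mean "nonfunctional".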
Here's another simple example illustrating that an iterator can liven up again after raising StopIteration. But this doesn't explain why the Python docs discourage doing things this way, nor what the hidden dangers (if any) of doing so might be. Try it online!
This example uses a single iterator called it. The latter two for loops illustrate that this iterator can continue working even after it has raised StopIteration to halt the earlier for loops.
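Since the actual code lives behind the "Try it online!" link, here is a hypothetical reconstruction of what such an example might look like: one iterator, three for loops, with the queue refilled between them.

```python
class StopAndGo:
    """Hypothetical reconstruction of the linked example: a single
    iterator that raises StopIteration, then resumes when refilled."""

    def __init__(self):
        self.items = []

    def __iter__(self):
        return self

    def __next__(self):
        if not self.items:
            raise StopIteration
        return self.items.pop(0)


it = StopAndGo()

it.items.extend([1, 2])
for x in it:        # first loop: prints 1, 2, then StopIteration ends it
    print(x)

it.items.extend([3, 4])
for x in it:        # second loop: the "broken" iterator yields again
    print(x)

it.items.extend([5, 6])
for x in it:        # third loop: and again
    print(x)
```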