
feat: consider adding prioritization #47

Closed
twpayne opened this issue Nov 26, 2024 · 4 comments
twpayne commented Nov 26, 2024

Nice library!

PriorityChannel and BufferedPriorityChannel allow you to re-order values flowing through a channel by priority. This is really handy when you have a channel of work items and want to work on the most important items first.

Would you be interested in a PR that adds this functionality to rill? No worries if you're either not interested or would rather implement this yourself.


destel commented Nov 26, 2024

Thank you very much!

I actually thought about implementing something like this later, but I'd be happy to accept a contribution.

Here's a very rough outline of the plan I had. The idea was to have something more generic: different kinds of buffers for channels. Built-in channels have FIFO buffers; in some cases it could be beneficial to have a LIFO buffer or a PriorityQueue buffer.

  1. Define a generic Buffer inside of the internal package. Something like

type Buffer[T any] interface {
    Read() (T, bool)
    Write(T)
    CanWrite() bool
    CanShrink() bool
    Shrink()
}

  2. Define some sort of WithCustomBuffer function inside of the internal package. I think it would be something similar to this https://github.com/destel/rill/blob/main/internal/core/delay.go#L10

  3. Define a PriorityQueueBuffer that implements the above interface inside of the internal package. Originally I planned to base this on stdlib's heap, or this small generic port of it https://gist.github.com/destel/7cd4637026a13f4d60bc90ebf105dd36 (see the sketch after this outline).

  4. In the main package, define a PriorityQueue function. Could be something like:

func PriorityQueue[A any](in <-chan Try[A], capacity int, less func(A, A) bool) <-chan Try[A] {

    lessWithErrs := func(a1, a2 Try[A]) bool {
        // always prioritize errors
        if a1.Error != nil {
            return true
        }
        if a2.Error != nil {
            return false
        }

        return less(a1.Value, a2.Value)
    }

    buf := internal.NewPriorityQueueBuffer(capacity, lessWithErrs)

    return internal.WithCustomBuffer(in, buf)
}
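
For concreteness, here is a minimal sketch of what step 3 (the PriorityQueueBuffer) could look like, assuming the Buffer interface from step 1. It uses a plain slice-based binary heap to stay dependency-free; everything here is illustrative, not actual rill code.

// Sketch only: assumes the Buffer[T] interface from step 1.
// A bounded binary heap; Read pops the item that less ranks highest.
type PriorityQueueBuffer[T any] struct {
    items    []T
    capacity int
    less     func(T, T) bool // true if the first argument has higher priority
}

func NewPriorityQueueBuffer[T any](capacity int, less func(T, T) bool) *PriorityQueueBuffer[T] {
    return &PriorityQueueBuffer[T]{capacity: capacity, less: less}
}

func (b *PriorityQueueBuffer[T]) CanWrite() bool { return len(b.items) < b.capacity }

func (b *PriorityQueueBuffer[T]) Write(v T) {
    // Append and sift up to restore the heap property.
    b.items = append(b.items, v)
    i := len(b.items) - 1
    for i > 0 {
        parent := (i - 1) / 2
        if !b.less(b.items[i], b.items[parent]) {
            break
        }
        b.items[i], b.items[parent] = b.items[parent], b.items[i]
        i = parent
    }
}

func (b *PriorityQueueBuffer[T]) Read() (T, bool) {
    var zero T
    if len(b.items) == 0 {
        return zero, false
    }
    top := b.items[0]
    last := len(b.items) - 1
    b.items[0] = b.items[last]
    b.items[last] = zero // avoid retaining a reference
    b.items = b.items[:last]
    // Sift down to restore the heap property.
    i := 0
    for {
        l, r := 2*i+1, 2*i+2
        best := i
        if l < len(b.items) && b.less(b.items[l], b.items[best]) {
            best = l
        }
        if r < len(b.items) && b.less(b.items[r], b.items[best]) {
            best = r
        }
        if best == i {
            break
        }
        b.items[i], b.items[best] = b.items[best], b.items[i]
        i = best
    }
    return top, true
}

// CanShrink/Shrink could release the backing array once the buffer drains;
// left as no-ops in this sketch.
func (b *PriorityQueueBuffer[T]) CanShrink() bool { return false }
func (b *PriorityQueueBuffer[T]) Shrink()         {}

Both Write and Read are O(log n), and the buffer never holds more than capacity items as long as the caller checks CanWrite before writing.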

What do you think about such a design? I'd prefer to keep the zero-dependencies approach for now, but I'm very interested in your thoughts on this.


twpayne commented Nov 26, 2024

> Built-in channels have FIFO buffers; in some cases it could be beneficial to have a LIFO buffer or a PriorityQueue buffer.

The trouble with a LIFO channel buffer is that if the rate of writes exceeds the rate of reads then the memory consumption is unbounded. I'm also not sure when you would need a LIFO channel buffer.

Buffered priority channels make more sense as they're effectively "give me the most important value from the last N that were written to the channel" which is bounded in memory.

So, until there are other meaningful channel buffer strategies, I don't think it makes sense to add the Buffer[T any] interface.
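
To make the bounded-memory point concrete, here is a rough sketch of the kind of pump loop a WithCustomBuffer-style helper (step 2 in the outline above) could run. This is an assumption about the design, not actual rill code: the loop stops reading from in while the buffer reports it is full, so it never holds more than capacity items, and the reader always receives the highest-priority item currently buffered.

// Sketch only: assumes the Buffer[T] interface from the outline above.
func pump[T any](in <-chan T, out chan<- T, buf Buffer[T]) {
    defer close(out)
    for {
        // Apply backpressure: stop reading from in while the buffer is full.
        recvCh := in
        if !buf.CanWrite() {
            recvCh = nil
        }

        top, ok := buf.Read()
        if !ok {
            // Buffer is empty: nothing to send.
            if in == nil {
                return // input closed and buffer drained
            }
            v, open := <-in
            if !open {
                return
            }
            buf.Write(v)
            continue
        }

        // Either forward the current highest-priority item, or accept one more input.
        select {
        case out <- top:
        case v, open := <-recvCh:
            buf.Write(top) // keep the unsent item
            if open {
                buf.Write(v)
            } else {
                in = nil // input closed: drain what's buffered
            }
        }
    }
}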

> I'd prefer to keep the zero-dependencies approach for now, but I'm very interested in your thoughts on this.

Re zero dependencies, github.com/twpayne/go-heap is MIT licensed, which means you are free to copy the code into this repository. I would happily open a PR for this (to keep the author credit) or you could include it with a Co-authored-by: line in the git commit message (which would also maintain the author credit).
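
For reference, the trailer format that Git and GitHub recognize for co-author credit looks like this; it goes on its own line at the end of the commit message, after a blank line, and GitHub matches the email to the account (the name and email shown here are placeholders):

Add priority queue buffer (ported from github.com/twpayne/go-heap)

Co-authored-by: Tom Payne <tom@example.com>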


destel commented Nov 26, 2024

I thought about it again. I think I'll implement this myself a bit later. I agree with your point on LIFO use cases, but I still want to further explore the idea of various buffer types and see where it can lead. Thanks for the permission to copy your heap implementation. If I do so, I'll of course include all the necessary credit and mentions.

Thank you for the suggestion and for taking the time! This is definitely valuable functionality to add to Rill.


twpayne commented Nov 26, 2024

Thanks! Closing this issue for now.

twpayne closed this as completed Nov 26, 2024