Make WithLatency visible for better customization #2827

Open
medusar opened this issue Apr 15, 2024 · 2 comments

Comments

medusar commented Apr 15, 2024

Feature Request

Is your feature request related to a problem? Please describe

I am using the dispatch method to send Redis requests, and I would like to monitor command completion latency.
I have read the code and found that when latency recording is enabled, Lettuce will create another object wrapping my command. This creates too many objects, especially when there is a high volume of requests.

I have also seen that if the command is an instance of WithLatency, no additional object is created, which is a good way to avoid the extra allocation.

But the WithLatency interface can't be reached from outside the package because it is only visible within the same package.
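
For context, dispatching a raw command typically looks something like the sketch below (the client/connection variables, key name, and codec choice are illustrative, not the actual code; Command, CommandArgs, CommandType and AsyncCommand live in io.lettuce.core.protocol, ValueOutput in io.lettuce.core.output):

StatefulRedisConnection<String, String> connection = client.connect();

// Build a raw GET command and dispatch it directly, bypassing the typed command API.
Command<String, String, String> command = new Command<>(CommandType.GET,
        new ValueOutput<>(StringCodec.UTF8),
        new CommandArgs<>(StringCodec.UTF8).addKey("some-key"));

// Every command dispatched this way goes through CommandHandler, where the
// latency wrapping discussed later in this thread happens.
connection.dispatch(new AsyncCommand<>(command));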

Describe the solution you'd like

Make the WithLatency interface visible outside its package so that users can customize their own commands.
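
To illustrate the intent, here is a rough sketch of what a user-defined command could look like if WithLatency were public. The sent(...) and firstResponse(...) methods are taken from the wrapping code quoted later in this thread; everything else, including the class name, is hypothetical:

// Hypothetical sketch: only possible once io.lettuce.core.protocol.WithLatency is made public.
public class LatencyAwareCommand<K, V, T> extends Command<K, V, T> implements WithLatency {

    private volatile long sentTime = -1;

    private volatile long firstResponseTime = -1;

    public LatencyAwareCommand(ProtocolKeyword type, CommandOutput<K, V, T> output, CommandArgs<K, V> args) {
        super(type, output, args);
    }

    @Override
    public void sent(long time) {
        // Called by CommandHandler instead of allocating a LatencyMeteredCommand wrapper.
        this.sentTime = time;
    }

    @Override
    public void firstResponse(long time) {
        this.firstResponseTime = time;
    }

    // ...plus whatever other members WithLatency declares (not visible in the quoted code).
}

With such a class, CommandHandler.potentiallyWrapLatencyCommand would record the timestamps on the command itself instead of allocating a LatencyMeteredCommand per request.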

Describe alternatives you've considered

There are no other alternatives for now.

Teachability, Documentation, Adoption, Migration Strategy

If the WithLatency interface is visible, we can implement our own commands, which reduces object creation.

tishun (Collaborator) commented Apr 26, 2024

Hey @medusar, can you please elaborate on what you mean by

when latency recording is enabled, Lettuce will create another object wrapping my command

Which approach of latency tracking are you talking about?

tishun added the status: waiting-for-feedback label on Apr 26, 2024
medusar (Author) commented Apr 29, 2024

Hey @medusar, can you please elaborate on what you mean by

when latency recording is enabled, Lettuce will create another object wrapping my command

Which approach of latency tracking are you talking about?

I am using Micrometer; the code is like this:

MicrometerOptions micrometerOptions = MicrometerOptions.builder()
        .enable()
        .histogram(true)
        .localDistinction(false)
        .targetPercentiles(new double[] { 0.5, 0.90, 0.99, 0.999 })
        .build();

ClientResources clientResources = ClientResources.builder()
        .commandLatencyRecorder(new MicrometerCommandLatencyRecorder(meterRegistry, micrometerOptions))
        .build();
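
(For completeness, these resources are then passed when the client is created, roughly like the line below; the URI is illustrative and not part of the snippet above.)

RedisClient client = RedisClient.create(clientResources, "redis://localhost:6379");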

By "when latency recording is enabled, the lettuce will create another object wrapping my command", I have checked the code, when commandLatencyRecorder is used, the command will be wrapped in another object, the code is in CommandHandler.potentiallyWrapLatencyCommand:

private RedisCommand<?, ?, ?> potentiallyWrapLatencyCommand(RedisCommand<?, ?, ?> command) {

    if (!latencyMetricsEnabled) {
        return command;
    }

    if (command instanceof WithLatency) {

        WithLatency withLatency = (WithLatency) command;

        withLatency.firstResponse(-1);
        withLatency.sent(nanoTime());

        return command;
    }

    LatencyMeteredCommand<?, ?, ?> latencyMeteredCommand = new LatencyMeteredCommand<>(command);
    latencyMeteredCommand.firstResponse(-1);
    latencyMeteredCommand.sent(nanoTime());

    return latencyMeteredCommand;
}

tishun added the status: waiting-for-triage label and removed the status: waiting-for-feedback label on Apr 30, 2024