git commit: 20fd237
[2023-12-13 03:01:38,951] ERROR LEAK: ByteBuf.release() was not called before it's garbage-collected. See https://netty.io/wiki/reference-counted-objects.html for more information.
Recent access records:
Created at:
    io.netty.buffer.PooledByteBufAllocator.newDirectBuffer(PooledByteBufAllocator.java:403)
    io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:188)
    io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:179)
    com.automq.stream.s3.DirectByteBufAlloc.byteBuffer(DirectByteBufAlloc.java:50)
    com.automq.stream.s3.DirectByteBufAlloc.byteBuffer(DirectByteBufAlloc.java:42)
    com.automq.stream.s3.StreamRecordBatchCodec.encode(StreamRecordBatchCodec.java:35)
    com.automq.stream.s3.model.StreamRecordBatch.encoded(StreamRecordBatch.java:42)
    com.automq.stream.s3.S3Storage.append(S3Storage.java:256)
    com.automq.stream.s3.S3Stream.append0(S3Stream.java:169)
    com.automq.stream.s3.S3Stream.lambda$append$0(S3Stream.java:150)
    com.automq.stream.utils.FutureUtil.exec(FutureUtil.java:61)
    com.automq.stream.s3.S3Stream.append(S3Stream.java:146)
    kafka.log.streamaspect.AlwaysSuccessClient$StreamImpl.append0(AlwaysSuccessClient.java:267)
    kafka.log.streamaspect.AlwaysSuccessClient$StreamImpl.append(AlwaysSuccessClient.java:259)
    kafka.log.streamaspect.LazyStream.append(LazyStream.java:120)
    kafka.log.streamaspect.DefaultElasticStreamSlice.append(DefaultElasticStreamSlice.java:79)
    kafka.log.streamaspect.ElasticLogFileRecords.append(ElasticLogFileRecords.java:174)
    kafka.log.streamaspect.ElasticLogSegment.append(ElasticLogSegment.scala:106)
    kafka.log.streamaspect.ElasticLog.append(ElasticLog.scala:189)
    kafka.log.UnifiedLog.$anonfun$append$2(UnifiedLog.scala:955)
    kafka.log.LocalLog$.maybeHandleIOException(LocalLog.scala:820)
    kafka.log.streamaspect.ElasticUnifiedLog.maybeHandleIOException(ElasticUnifiedLog.scala:59)
    kafka.log.UnifiedLog.append(UnifiedLog.scala:827)
    kafka.log.UnifiedLog.appendAsLeader(UnifiedLog.scala:766)
    kafka.cluster.Partition.$anonfun$appendRecordsToLeader$1(Partition.scala:1295)
    kafka.cluster.Partition.appendRecordsToLeader(Partition.scala:1279)
    kafka.server.ReplicaManager.$anonfun$appendToLocalLog$6(ReplicaManager.scala:996)
    scala.collection.StrictOptimizedMapOps.map(StrictOptimizedMapOps.scala:28)
    scala.collection.StrictOptimizedMapOps.map$(StrictOptimizedMapOps.scala:27)
    scala.collection.mutable.HashMap.map(HashMap.scala:35)
    kafka.server.ReplicaManager.appendToLocalLog(ReplicaManager.scala:984)
    kafka.server.ReplicaManager.appendRecords(ReplicaManager.scala:642)
    kafka.server.KafkaApis.doAppendRecords$1(KafkaApis.scala:768)
    kafka.server.KafkaApis.handleProduceRequest(KafkaApis.scala:779)
    kafka.server.KafkaApis.handle(KafkaApis.scala:251)
    kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:77)
    java.base/java.lang.Thread.run(Thread.java:840)
(io.netty.util.ResourceLeakDetector)
[2023-12-13 03:01:56,712] ERROR LEAK: ByteBuf.release() was not called before it's garbage-collected. See https://netty.io/wiki/reference-counted-objects.html for more information.
Recent access records:
Created at:
    io.netty.buffer.PooledByteBufAllocator.newDirectBuffer(PooledByteBufAllocator.java:403)
    io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:188)
    io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:179)
    com.automq.stream.s3.DirectByteBufAlloc.byteBuffer(DirectByteBufAlloc.java:50)
    com.automq.stream.s3.DirectByteBufAlloc.byteBuffer(DirectByteBufAlloc.java:42)
    com.automq.stream.s3.StreamRecordBatchCodec.duplicateDecode(StreamRecordBatchCodec.java:60)
    com.automq.stream.s3.ObjectReader$DataBlock$1.next(ObjectReader.java:313)
    com.automq.stream.s3.ObjectReader$DataBlock$1.next(ObjectReader.java:301)
    com.automq.stream.s3.cache.DataBlockRecords.complete(DataBlockRecords.java:49)
    com.automq.stream.s3.cache.DataBlockReadAccumulator.lambda$readDataBlock$1(DataBlockReadAccumulator.java:83)
    java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863)
    java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841)
    java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510)
    java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147)
    com.automq.stream.s3.operator.DefaultS3Operator$MergedReadTask.handleReadCompleted(DefaultS3Operator.java:738)
    com.automq.stream.s3.operator.DefaultS3Operator.lambda$tryMergeRead0$3(DefaultS3Operator.java:246)
    com.automq.stream.utils.FutureUtil.suppress(FutureUtil.java:37)
    com.automq.stream.s3.operator.DefaultS3Operator.lambda$tryMergeRead0$4(DefaultS3Operator.java:246)
    java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863)
    java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841)
    java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510)
    java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147)
    com.automq.stream.s3.operator.DefaultS3Operator.lambda$acquireReadPermit$38(DefaultS3Operator.java:625)
    java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    java.base/java.lang.Thread.run(Thread.java:840)
(io.netty.util.ResourceLeakDetector)
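For context, Netty's ResourceLeakDetector fires when a reference-counted ByteBuf is garbage-collected without release() ever being called on it; both traces show the buffers being created through DirectByteBufAlloc (during encode on the write path and duplicateDecode on the read path). The snippet below is only a minimal sketch of the release discipline the leak detector expects, written against the plain Netty allocator API rather than AutoMQ's wrapper, and it is not the actual fix applied in the codebase.

```java
import io.netty.buffer.ByteBuf;
import io.netty.buffer.PooledByteBufAllocator;

public class ByteBufReleaseSketch {
    public static void main(String[] args) {
        // Allocate a pooled direct buffer, roughly what the traces show
        // DirectByteBufAlloc.byteBuffer(...) doing underneath.
        ByteBuf buf = PooledByteBufAllocator.DEFAULT.directBuffer(4096);
        try {
            buf.writeBytes(new byte[]{1, 2, 3, 4});
            // ... use the encoded bytes (copy them out, hand them to the next stage) ...
        } finally {
            // Reference-counted buffers must be released explicitly once the last
            // consumer is done with them; relying on GC produces the LEAK report above.
            buf.release();
        }
    }
}
```

In asynchronous paths like the read-completion callbacks in the second trace, the same rule applies: whichever stage ends up owning the ByteBuf is responsible for calling release(), and shared ownership has to be balanced with matching retain()/release() calls.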
fixed by AutoMQ/automq-for-rocketmq#830
Referenced commit: feat(s3stream): simplify operation counter metrics (#553), commits 861401c and d2e406b by superhx, Signed-off-by: Shichao Nie <[email protected]>