diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md index 2e863762fd..cb23547755 100644 --- a/CODE_OF_CONDUCT.md +++ b/CODE_OF_CONDUCT.md @@ -1,5 +1,5 @@ -This code of conduct applies to all spaces provided by the OpenSource project including in code, documentation, issue trackers, mailing lists, chat channels, wikis, blogs, social media and any other communication channels used by the project. +This code of conduct applies to all spaces provided by the OpenSource project including in code, documentation, issue trackers, mailing lists, chat channels, wikis, blogs, social media, events, conferences, meetings, and any other communication channels used by the project. **Our open source communities endeavor to:** @@ -8,7 +8,6 @@ This code of conduct applies to all spaces provided by the OpenSource project in * Be Respectful: We are committed to encouraging differing viewpoints, accepting constructive criticism and work collaboratively towards decisions that help the project grow. Disrespectful and unacceptable behavior will not be tolerated. * Be Collaborative: We are committed to supporting what is best for our community and users. When we build anything for the benefit of the project, we should document the work we do and communicate to others on how this affects their work. - **Our Responsibility. As contributors, members, or bystanders we each individually have the responsibility to behave professionally and respectfully at all times. Disrespectful and unacceptable behaviors include, but are not limited to:** * The use of violent threats, abusive, discriminatory, or derogatory language; @@ -19,6 +18,7 @@ This code of conduct applies to all spaces provided by the OpenSource project in * Publishing private information, such as physical or electronic address, without permission; * Other conduct which could reasonably be considered inappropriate in a professional setting; * Advocating for or encouraging any of the above behaviors. -* Enforcement and Reporting Code of Conduct Issues: + +**Enforcement and Reporting Code of Conduct Issues:** Instances of abusive, harassing, or otherwise unacceptable behavior may be reported. [Contact us](mailto:opensource-codeofconduct@amazon.com). All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances. diff --git a/TRIAGING.md b/TRIAGING.md new file mode 100644 index 0000000000..a4a25e1932 --- /dev/null +++ b/TRIAGING.md @@ -0,0 +1,73 @@ +Data Prepper + +The Data Prepper maintainers seek to promote an inclusive and engaged community of contributors. +In order to facilitate this, weekly triage meetings are open to all and attendance is encouraged for anyone who hopes to contribute, discuss an issue, or learn more about the project. +To learn more about contributing to the Data Prepper project visit the [Contributing](./CONTRIBUTING.md) documentation. + +### Do I need to attend for my issue to be addressed/triaged? + +Attendance is not required for your issue to be triaged or addressed. +All new issues are triaged weekly. + +### What happens if my issue does not get covered this time? + +Each meeting we seek to address all new issues. +However, should we run out of time before your issue is discussed, you are always welcome to attend the next meeting or to follow up on the issue post itself. + +### How do I join the triage meeting? 
+ +Meetings are hosted regularly on Tuesdays at 2:30 PM US Central Time (12:30 PM Pacific Time) and can be joined via the links posted on the [OpenSearch Meetup Group](https://www.meetup.com/opensearch/events/) list of events. +The event will be titled `Data Prepper Triage Meeting`. + +After joining the Zoom meeting, you can enable your video / voice to join the discussion. +If you do not have a webcam or microphone available, you can still join in via the text chat. + +If you have an issue you'd like to bring forth, please have a link to the issue ready so it can be presented to everyone in the meeting. + +### Is there an agenda for each week? + +Meetings are 30 minutes and structured as follows: + +1. Initial Gathering: As we gather, feel free to turn on video and engage in informal and open-to-all conversation. A volunteer Data Prepper maintainer will share the [Data Prepper Tracking Board](https://github.com/orgs/opensearch-project/projects/82/) and proceed. +2. Announcements: We will make any announcements at the beginning, if necessary. +3. Untriaged issues: We will review all untriaged [issues](https://github.com/orgs/opensearch-project/projects/82/views/6) for the Data Prepper repository. If you have an item here, you may spend a few minutes explaining your request. +4. Member Requests: An opportunity for any meeting member to ask for consideration of an issue or pull request. +5. Release review: If time permits, and we find it necessary, we will review [items for the current release](https://github.com/orgs/opensearch-project/projects/82/views/14). +6. Follow-up issues: If time permits, we will review the [follow up items](https://github.com/orgs/opensearch-project/projects/82/views/18). +7. Open Discussion: If time permits, members of the meeting may surface any topics that do not yet have an issue filed or a pull request created. + +### Do I need to have already contributed to the project to attend a triage meeting? + +No, all are welcome and encouraged to attend. +Attending the triage meetings is a great way for a new contributor to learn about the project as well as explore different avenues of contribution. + +### What if I have follow-up questions on an issue? + +If you have an existing issue you would like to discuss, you can always comment on the issue itself. +Alternatively, you are welcome to come to the triage meeting to discuss. + +### Is this meeting a good place to get help using Data Prepper? + +While we are always happy to help the community, the best resource for usage questions is the [Data Prepper discussion forum](https://github.com/opensearch-project/data-prepper/discussions) on GitHub. + +There you can find answers to many common questions as well as speak with implementation experts and Data Prepper maintainers. + +### What are the issue labels associated with triaging? + +There are several labels that are particularly important for triaging in Data Prepper: + +| Label | When applied | Meaning | +| ----- | ------------ | ------- | +| [untriaged](https://github.com/opensearch-project/data-prepper/labels/untriaged) | When issues are created or re-opened. | Issues labeled as `untriaged` require the attention of the repository maintainers and may need to be prioritized for quicker resolution. It's crucial to keep the count of `untriaged` labels low to ensure all issues are addressed in a timely manner. | +| [follow up](https://github.com/opensearch-project/data-prepper/labels/follow%20up) | During triage meetings.
| Issues labeled as `follow up` have been triaged. However, the maintainers may need to follow up further on it. This tag lets us triage an issue as not critical, but also be able to come back to it. +| [help wanted](https://github.com/opensearch-project/data-prepper/labels/help%20wanted) | Anytime. | Issues marked as `help wanted` signal that they are actionable and not the current focus of the project maintainers. Community contributions are especially encouraged for these issues. | +| [good first issue](https://github.com/opensearch-project/data-prepper/labels/good%20first%20issue) | Anytime. | Issues labeled as `good first issue` are small in scope and can be resolved with a single pull request. These are recommended starting points for newcomers looking to make their first contributions. | + + +### Is this where I should bring up potential security vulnerabilities? + +Due to the sensitive nature of security vulnerabilities, please report all potential vulnerabilities directly by following the steps outlined in the [Security Issue Response Process](https://github.com/opensearch-project/data-prepper/security/policy). + +### Who should I contact if I have further questions? + +You can always file an [issue](https://github.com/opensearch-project/data-prepper/issues/new/choose) for any question you have about the project. diff --git a/build.gradle b/build.gradle index f4bbccbcc2..3dccd497cf 100644 --- a/build.gradle +++ b/build.gradle @@ -69,7 +69,7 @@ subprojects { } } dependencies { - implementation platform('com.fasterxml.jackson:jackson-bom:2.16.1') + implementation platform('com.fasterxml.jackson:jackson-bom:2.17.2') implementation platform('org.eclipse.jetty:jetty-bom:9.4.53.v20231009') implementation platform('io.micrometer:micrometer-bom:1.10.5') implementation libs.guava.core @@ -226,6 +226,9 @@ subprojects { test { useJUnitPlatform() + javaLauncher = javaToolchains.launcherFor { + languageVersion = JavaLanguageVersion.current() + } reports { junitXml.required html.required diff --git a/data-prepper-api/build.gradle b/data-prepper-api/build.gradle index 4ee8e7316e..045d331704 100644 --- a/data-prepper-api/build.gradle +++ b/data-prepper-api/build.gradle @@ -12,11 +12,11 @@ dependencies { implementation 'com.fasterxml.jackson.core:jackson-databind' implementation 'com.fasterxml.jackson.datatype:jackson-datatype-jsr310' implementation 'com.fasterxml.jackson.datatype:jackson-datatype-jdk8' - implementation 'org.apache.parquet:parquet-common:1.14.0' + implementation libs.parquet.common testImplementation 'com.fasterxml.jackson.dataformat:jackson-dataformat-yaml' implementation libs.commons.lang3 testImplementation project(':data-prepper-test-common') - testImplementation 'org.skyscreamer:jsonassert:1.5.1' + testImplementation 'org.skyscreamer:jsonassert:1.5.3' testImplementation libs.commons.io } diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/acknowledgements/AcknowledgementSetManager.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/acknowledgements/AcknowledgementSetManager.java index 69c07c4aa5..6afebeaa91 100644 --- a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/acknowledgements/AcknowledgementSetManager.java +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/acknowledgements/AcknowledgementSetManager.java @@ -5,9 +5,6 @@ package org.opensearch.dataprepper.model.acknowledgements; -import org.opensearch.dataprepper.model.event.Event; -import org.opensearch.dataprepper.model.event.EventHandle; - 
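// A minimal migration sketch, assuming caller code that used the two methods removed
// below: reference counting now goes through the event handle itself, via the
// InternalEventHandle#acquireReference() and EventHandle#release(boolean) APIs added
// elsewhere in this change:
//   ((InternalEventHandle) event.getEventHandle()).acquireReference();
//   // ... process the event ...
//   event.getEventHandle().release(true);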
import java.time.Duration; import java.util.function.Consumer; @@ -29,32 +26,4 @@ public interface AcknowledgementSetManager { * @since 2.2 */ AcknowledgementSet create(final Consumer callback, final Duration timeout); - - /** - * Releases an event's reference - * - * @param eventHandle event handle - * @param success indicates negative or positive acknowledgement - * - * @since 2.2 - */ - void releaseEventReference(final EventHandle eventHandle, boolean success); - - /** - * Acquires an event's reference - * - * @param eventHandle event handle - * - * @since 2.2 - */ - void acquireEventReference(final EventHandle eventHandle); - - /** - * Acquires an event's reference - * - * @param event event - * - * @since 2.2 - */ - void acquireEventReference(final Event event); } diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/configuration/PluginSetting.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/configuration/PluginSetting.java index a8ea4a3ee1..61db9a3c7e 100644 --- a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/configuration/PluginSetting.java +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/configuration/PluginSetting.java @@ -5,10 +5,22 @@ package org.opensearch.dataprepper.model.configuration; +import org.opensearch.dataprepper.model.annotations.DataPrepperPlugin; + import java.util.Collections; import java.util.List; import java.util.Map; +/** + * Deprecated class for getting plugin settings. + *
<p>
+ * Only projects within data-prepper-core should use this. It is currently used + * extensively in plugin framework to load plugins. In Data Prepper 3.0 this + * class will be moved into data-prepper-core and not exposed to plugins anymore. + * + * @deprecated Use {@link DataPrepperPlugin#pluginConfigurationType()} or {@link PipelineDescription} instead. + */ +@Deprecated public class PluginSetting implements PipelineDescription { private static final String UNEXPECTED_ATTRIBUTE_TYPE_MSG = "Unexpected type [%s] for attribute [%s]"; diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/AbstractEventHandle.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/AbstractEventHandle.java new file mode 100644 index 0000000000..2ca40fbe59 --- /dev/null +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/AbstractEventHandle.java @@ -0,0 +1,52 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.model.event; + +import java.util.ArrayList; +import java.util.List; +import java.time.Instant; +import java.util.function.BiConsumer; + +abstract class AbstractEventHandle implements EventHandle, InternalEventHandle { + private Instant externalOriginationTime; + private final Instant internalOriginationTime; + private List> releaseConsumers; + + AbstractEventHandle(final Instant internalOriginationTime) { + this.externalOriginationTime = null; + this.internalOriginationTime = internalOriginationTime; + this.releaseConsumers = new ArrayList<>(); + } + @Override + public void setExternalOriginationTime(final Instant externalOriginationTime) { + this.externalOriginationTime = externalOriginationTime; + } + + @Override + public Instant getInternalOriginationTime() { + return this.internalOriginationTime; + } + + @Override + public Instant getExternalOriginationTime() { + return this.externalOriginationTime; + } + + @Override + public void onRelease(BiConsumer releaseConsumer) { + synchronized (releaseConsumers) { + releaseConsumers.add(releaseConsumer); + } + } + + public void notifyReleaseConsumers(boolean result) { + synchronized (releaseConsumers) { + for (final BiConsumer consumer: releaseConsumers) { + consumer.accept(this, result); + } + } + } +} diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/AggregateEventHandle.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/AggregateEventHandle.java new file mode 100644 index 0000000000..921d689a3c --- /dev/null +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/AggregateEventHandle.java @@ -0,0 +1,77 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.model.event; + +import org.opensearch.dataprepper.model.acknowledgements.AcknowledgementSet; +import java.lang.ref.WeakReference; + +import java.util.ArrayList; +import java.util.HashSet; +import java.util.List; +import java.util.Set; +import java.time.Instant; +import java.io.Serializable; + +public class AggregateEventHandle extends AbstractEventHandle implements Serializable { + private List> acknowledgementSetRefList; + private Set acknowledgementSetHashes; + + public AggregateEventHandle(final Instant internalOriginationTime) { + super(internalOriginationTime); + this.acknowledgementSetRefList = new ArrayList<>(); + this.acknowledgementSetHashes = new HashSet<>(); + } + + @Override + public void 
addAcknowledgementSet(final AcknowledgementSet acknowledgementSet) { + int hashCode = acknowledgementSet.hashCode(); + if (!acknowledgementSetHashes.contains(hashCode)) { + this.acknowledgementSetRefList.add(new WeakReference<>(acknowledgementSet)); + acknowledgementSetHashes.add(hashCode); + } + } + + @Override + public boolean hasAcknowledgementSet() { + return acknowledgementSetRefList.size() != 0; + } + + @Override + public void acquireReference() { + synchronized (this) { + for (WeakReference acknowledgementSetRef: acknowledgementSetRefList) {; + AcknowledgementSet acknowledgementSet = acknowledgementSetRef.get(); + if (acknowledgementSet != null) { + acknowledgementSet.acquire(this); + } + } + } + } + + @Override + public boolean release(boolean result) { + notifyReleaseConsumers(result); + boolean returnValue = true; + synchronized (this) { + for (WeakReference acknowledgementSetRef: acknowledgementSetRefList) { + AcknowledgementSet acknowledgementSet = acknowledgementSetRef.get(); + if (acknowledgementSet != null) { + acknowledgementSet.release(this, result); + } else { + returnValue = false; + } + } + } + return returnValue; + } + + // For testing + List> getAcknowledgementSetRefs() { + return acknowledgementSetRefList; + } + +} + diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/DefaultEventHandle.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/DefaultEventHandle.java index 743309bf75..340c104a14 100644 --- a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/DefaultEventHandle.java +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/DefaultEventHandle.java @@ -8,35 +8,22 @@ import org.opensearch.dataprepper.model.acknowledgements.AcknowledgementSet; import java.lang.ref.WeakReference; -import java.util.ArrayList; -import java.util.List; -import java.util.function.BiConsumer; import java.time.Instant; import java.io.Serializable; -public class DefaultEventHandle implements EventHandle, InternalEventHandle, Serializable { - private Instant externalOriginationTime; - private final Instant internalOriginationTime; +public class DefaultEventHandle extends AbstractEventHandle implements Serializable { private WeakReference acknowledgementSetRef; - private List> releaseConsumers; public DefaultEventHandle(final Instant internalOriginationTime) { + super(internalOriginationTime); this.acknowledgementSetRef = null; - this.externalOriginationTime = null; - this.internalOriginationTime = internalOriginationTime; - this.releaseConsumers = new ArrayList<>(); } @Override - public void setAcknowledgementSet(final AcknowledgementSet acknowledgementSet) { + public void addAcknowledgementSet(final AcknowledgementSet acknowledgementSet) { this.acknowledgementSetRef = new WeakReference<>(acknowledgementSet); } - @Override - public void setExternalOriginationTime(final Instant externalOriginationTime) { - this.externalOriginationTime = externalOriginationTime; - } - public AcknowledgementSet getAcknowledgementSet() { if (acknowledgementSetRef == null) { return null; @@ -45,32 +32,30 @@ public AcknowledgementSet getAcknowledgementSet() { } @Override - public Instant getInternalOriginationTime() { - return this.internalOriginationTime; + public boolean hasAcknowledgementSet() { + AcknowledgementSet acknowledgementSet = getAcknowledgementSet(); + return acknowledgementSet != null; } @Override - public Instant getExternalOriginationTime() { - return this.externalOriginationTime; + public void 
acquireReference() { + synchronized (this) { + AcknowledgementSet acknowledgementSet = getAcknowledgementSet(); + if (acknowledgementSet != null) { + acknowledgementSet.acquire(this); + } + } } @Override - public void release(boolean result) { - synchronized (releaseConsumers) { - for (final BiConsumer consumer: releaseConsumers) { - consumer.accept(this, result); - } - } + public boolean release(boolean result) { + notifyReleaseConsumers(result); AcknowledgementSet acknowledgementSet = getAcknowledgementSet(); if (acknowledgementSet != null) { acknowledgementSet.release(this, result); + return true; } + return false; } - @Override - public void onRelease(BiConsumer releaseConsumer) { - synchronized (releaseConsumers) { - releaseConsumers.add(releaseConsumer); - } - } } diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/Event.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/Event.java index 740447ecc0..e0e36d9237 100644 --- a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/Event.java +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/Event.java @@ -26,6 +26,15 @@ */ public interface Event extends Serializable { + /** + * Adds or updates the key with a given value in the Event + * + * @param key where the value will be set + * @param value value to set the key to + * @since 2.8 + */ + void put(EventKey key, Object value); + /** * Adds or updates the key with a given value in the Event * @@ -35,6 +44,17 @@ public interface Event extends Serializable { */ void put(String key, Object value); + /** + * Retrieves the given key from the Event + * + * @param key the value to retrieve from + * @param clazz the return type of the value + * @param The type + * @return T a clazz object from the key + * @since 2.8 + */ + T get(EventKey key, Class clazz); + /** * Retrieves the given key from the Event * @@ -46,6 +66,17 @@ public interface Event extends Serializable { */ T get(String key, Class clazz); + /** + * Retrieves the given key from the Event as a List + * + * @param key the value to retrieve from + * @param clazz the return type of elements in the list + * @param The type + * @return {@literal List} a list of clazz elements + * @since 2.8 + */ + List getList(EventKey key, Class clazz); + /** * Retrieves the given key from the Event as a List * @@ -57,6 +88,14 @@ public interface Event extends Serializable { */ List getList(String key, Class clazz); + /** + * Deletes the given key from the Event + * + * @param key the field to be deleted + * @since 2.8 + */ + void delete(EventKey key); + /** * Deletes the given key from the Event * @@ -87,6 +126,15 @@ public interface Event extends Serializable { */ JsonNode getJsonNode(); + /** + * Gets a serialized Json string of the specific key in the Event + * + * @param key the field to be returned + * @return Json string of the field + * @since 2.8 + */ + String getAsJsonString(EventKey key); + /** * Gets a serialized Json string of the specific key in the Event * @@ -104,6 +152,15 @@ public interface Event extends Serializable { */ EventMetadata getMetadata(); + /** + * Checks if the key exists. + * + * @param key name of the key to look for + * @return returns true if the key exists, otherwise false + * @since 2.8 + */ + boolean containsKey(EventKey key); + /** * Checks if the key exists. 
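* <p>
* For example: {@code boolean hasStatus = event.containsKey("status");}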
* @@ -113,6 +170,15 @@ public interface Event extends Serializable { */ boolean containsKey(String key); + /** + * Checks if the value stored for the key is list + * + * @param key name of the key to look for + * @return returns true if the key is a list, otherwise false + * @since 2.8 + */ + boolean isValueAList(EventKey key); + /** * Checks if the value stored for the key is list * diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/EventHandle.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/EventHandle.java index d05dd8e36c..898384c32e 100644 --- a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/EventHandle.java +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/EventHandle.java @@ -14,9 +14,10 @@ public interface EventHandle { * * @param result result to be used while releasing. This indicates if * the operation on the event handle is success or not + * @return returns true if the event handle is released successful, false otherwise * @since 2.2 */ - void release(boolean result); + boolean release(boolean result); /** * sets external origination time diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/EventKey.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/EventKey.java new file mode 100644 index 0000000000..9086f0f641 --- /dev/null +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/EventKey.java @@ -0,0 +1,21 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.model.event; + +/** + * Model class to represent a key into a Data Prepper {@link Event}. + * + * @since 2.9 + */ +public interface EventKey { + /** + * The original key provided as a string. + * + * @return The key as a string + * @since 2.9 + */ + String getKey(); +} diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/EventKeyConfiguration.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/EventKeyConfiguration.java new file mode 100644 index 0000000000..c35e8db38c --- /dev/null +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/EventKeyConfiguration.java @@ -0,0 +1,34 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.model.event; + +import java.lang.annotation.Documented; +import java.lang.annotation.ElementType; +import java.lang.annotation.Retention; +import java.lang.annotation.RetentionPolicy; +import java.lang.annotation.Target; + +/** + * An annotation for an {@link EventKey} used in a Data Prepper pipeline configuration. + *
<p>
+ * Unless you need all actions on a configuration, you should use this annotation to + * provide the most appropriate validation. + * + * @since 2.9 + */ +@Documented +@Retention(RetentionPolicy.RUNTIME) +@Target({ElementType.FIELD}) +public @interface EventKeyConfiguration { + /** + * Defines the {@link EventKeyFactory.EventAction}s to use when creating the {@link EventKey} + * for the configuration. + * + * @return The desired event actions. + * @since 2.9 + */ + EventKeyFactory.EventAction[] value(); +} diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/EventKeyFactory.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/EventKeyFactory.java new file mode 100644 index 0000000000..e7cbc25463 --- /dev/null +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/EventKeyFactory.java @@ -0,0 +1,71 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.model.event; + +import java.util.Arrays; +import java.util.Collections; +import java.util.EnumSet; +import java.util.List; +import java.util.Set; + +/** + * A factory for producing {@link EventKey} objects. + * + * @since 2.9 + */ +public interface EventKeyFactory { + /** + * Creates an {@link EventKey} with given actions. + * + * @param key The key + * @param forActions Actions to support + * @return The EventKey + * @since 2.9 + */ + EventKey createEventKey(String key, EventAction... forActions); + + /** + * Creates an {@link EventKey} for the default actions, which are all. + * + * @param key The key + * @return The EventKey + * @since 2.9 + */ + default EventKey createEventKey(final String key) { + return createEventKey(key, EventAction.ALL); + } + + /** + * An action on an Event. + * + * @since 2.9 + */ + enum EventAction { + GET, + DELETE, + PUT, + ALL(GET, DELETE, PUT); + + private final List includedActions; + + EventAction(EventAction... 
eventActions) { + includedActions = Arrays.asList(eventActions); + + } + + boolean isMutableAction() { + return this != GET; + } + + Set getSupportedActions() { + final EnumSet supportedActions = EnumSet.noneOf(EventAction.class); + supportedActions.add(this); + supportedActions.addAll(includedActions); + + return Collections.unmodifiableSet(supportedActions); + } + } +} diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/InternalEventHandle.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/InternalEventHandle.java index 3817365f17..3ee88f698b 100644 --- a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/InternalEventHandle.java +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/InternalEventHandle.java @@ -9,20 +9,27 @@ public interface InternalEventHandle { /** - * sets acknowledgement set + * adds acknowledgement set * * @param acknowledgementSet acknowledgementSet to be set in the event handle - * @since 2.6 + * @since 2.9 */ - void setAcknowledgementSet(final AcknowledgementSet acknowledgementSet); + void addAcknowledgementSet(final AcknowledgementSet acknowledgementSet); /** - * gets acknowledgement set + * Indicates if the event handle has atleast one acknowledgement set * - * @return returns acknowledgementSet from the event handle - * @since 2.6 + * @return returns true if there is at least one acknowledgementSet in the event handle + * @since 2.9 */ - AcknowledgementSet getAcknowledgementSet(); + boolean hasAcknowledgementSet(); + + /** + * Acquires reference to acknowledgement set(s) in the event handle + * + * @since 2.9 + */ + void acquireReference(); } diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/JacksonEvent.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/JacksonEvent.java index 9ef34bb82c..25ef31ec8b 100644 --- a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/JacksonEvent.java +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/JacksonEvent.java @@ -28,8 +28,8 @@ import java.io.ObjectInputStream; import java.time.Instant; import java.util.ArrayList; -import java.util.Arrays; import java.util.Collections; +import java.util.Deque; import java.util.HashMap; import java.util.Iterator; import java.util.LinkedList; @@ -38,8 +38,8 @@ import java.util.Objects; import java.util.StringJoiner; -import static com.google.common.base.Preconditions.checkArgument; import static com.google.common.base.Preconditions.checkNotNull; +import static org.opensearch.dataprepper.model.event.JacksonEventKey.trimTrailingSlashInKey; /** * A Jackson Implementation of {@link Event} interface. This implementation relies heavily on JsonNode to manage the keys of the event. @@ -137,20 +137,15 @@ public JsonNode getJsonNode() { return jsonNode; } - /** - * Adds or updates the key with a given value in the Event. 
- * - * @param key where the value will be set - * @param value value to set the key to - * @since 1.2 - */ @Override - public void put(final String key, final Object value) { - checkArgument(!key.isEmpty(), "key cannot be an empty string for put method"); + public void put(EventKey key, Object value) { + final JacksonEventKey jacksonEventKey = asJacksonEventKey(key); - final String trimmedKey = checkAndTrimKey(key); + if(!jacksonEventKey.supports(EventKeyFactory.EventAction.PUT)) { + throw new IllegalArgumentException("key cannot be an empty string for put method"); + } - final LinkedList keys = new LinkedList<>(Arrays.asList(trimmedKey.split(SEPARATOR, -1))); + final Deque keys = new LinkedList<>(jacksonEventKey.getKeyPathList()); JsonNode parentNode = jsonNode; @@ -166,6 +161,19 @@ public void put(final String key, final Object value) { } } + /** + * Adds or updates the key with a given value in the Event. + * + * @param key where the value will be set + * @param value value to set the key to + * @since 1.2 + */ + @Override + public void put(final String key, final Object value) { + final JacksonEventKey jacksonEventKey = new JacksonEventKey(key, true, EventKeyFactory.EventAction.PUT); + put(jacksonEventKey, value); + } + @Override public EventHandle getEventHandle() { return eventHandle; @@ -189,6 +197,27 @@ private JsonNode getOrCreateNode(final JsonNode node, final String key) { return childNode; } + @Override + public T get(EventKey key, Class clazz) { + JacksonEventKey jacksonEventKey = asJacksonEventKey(key); + + final JsonNode node = getNode(jacksonEventKey); + if (node.isMissingNode()) { + return null; + } + + return mapNodeToObject(key.getKey(), node, clazz); + } + + private static JacksonEventKey asJacksonEventKey(EventKey key) { + if(!(key instanceof JacksonEventKey)) { + throw new IllegalArgumentException("The key provided must be obtained through the EventKeyFactory."); + } + + JacksonEventKey jacksonEventKey = (JacksonEventKey) key; + return jacksonEventKey; + } + /** * Retrieves the value of type clazz from the key. 
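* <p>
* For example: {@code String status = event.get("status", String.class);} returns the
* value at {@code status}, or {@code null} if the key is missing.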
* @@ -200,15 +229,8 @@ private JsonNode getOrCreateNode(final JsonNode node, final String key) { */ @Override public T get(final String key, final Class clazz) { - - final String trimmedKey = checkAndTrimKey(key); - - final JsonNode node = getNode(trimmedKey); - if (node.isMissingNode()) { - return null; - } - - return mapNodeToObject(key, node, clazz); + final JacksonEventKey jacksonEventKey = new JacksonEventKey(key, true, EventKeyFactory.EventAction.GET); + return get(jacksonEventKey, clazz); } private JsonNode getNode(final String key) { @@ -216,6 +238,10 @@ private JsonNode getNode(final String key) { return jsonNode.at(jsonPointer); } + private JsonNode getNode(final JacksonEventKey key) { + return jsonNode.at(key.getJsonPointer()); + } + private T mapNodeToObject(final String key, final JsonNode node, final Class clazz) { try { return mapper.treeToValue(node, clazz); @@ -225,6 +251,18 @@ private T mapNodeToObject(final String key, final JsonNode node, final Class } } + @Override + public List getList(EventKey key, Class clazz) { + JacksonEventKey jacksonEventKey = asJacksonEventKey(key); + + final JsonNode node = getNode(jacksonEventKey); + if (node.isMissingNode()) { + return null; + } + + return mapNodeToList(jacksonEventKey.getKey(), node, clazz); + } + /** * Retrieves the given key from the Event as a List * @@ -236,15 +274,8 @@ private T mapNodeToObject(final String key, final JsonNode node, final Class */ @Override public List getList(final String key, final Class clazz) { - - final String trimmedKey = checkAndTrimKey(key); - - final JsonNode node = getNode(trimmedKey); - if (node.isMissingNode()) { - return null; - } - - return mapNodeToList(key, node, clazz); + JacksonEventKey jacksonEventKey = new JacksonEventKey(key, true, EventKeyFactory.EventAction.GET); + return getList(jacksonEventKey, clazz); } private List mapNodeToList(final String key, final JsonNode node, final Class clazz) { @@ -267,16 +298,15 @@ private JsonPointer toJsonPointer(final String key) { return JsonPointer.compile(jsonPointerExpression); } - /** - * Deletes the key from the event. - * - * @param key the field to be deleted - */ @Override - public void delete(final String key) { + public void delete(final EventKey key) { + final JacksonEventKey jacksonEventKey = asJacksonEventKey(key); + + if(!jacksonEventKey.supports(EventKeyFactory.EventAction.DELETE)) { + throw new IllegalArgumentException("key cannot be an empty string for delete method"); + } - checkArgument(!key.isEmpty(), "key cannot be an empty string for delete method"); - final String trimmedKey = checkAndTrimKey(key); + final String trimmedKey = jacksonEventKey.getTrimmedKey(); final int index = trimmedKey.lastIndexOf(SEPARATOR); JsonNode baseNode = jsonNode; @@ -293,6 +323,17 @@ public void delete(final String key) { } } + /** + * Deletes the key from the event. 
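+ * For example: {@code event.delete("status");} is a no-op if the key does not exist.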
+ * + * @param key the field to be deleted + */ + @Override + public void delete(final String key) { + final JacksonEventKey jacksonEventKey = new JacksonEventKey(key, true, EventKeyFactory.EventAction.DELETE); + delete(jacksonEventKey); + } + @Override public void clear() { // Delete all entries from the event @@ -309,16 +350,22 @@ public String toJsonString() { } @Override - public String getAsJsonString(final String key) { - final String trimmedKey = checkAndTrimKey(key); + public String getAsJsonString(EventKey key) { - final JsonNode node = getNode(trimmedKey); + JacksonEventKey jacksonEventKey = asJacksonEventKey(key); + final JsonNode node = getNode(jacksonEventKey); if (node.isMissingNode()) { return null; } return node.toString(); } + @Override + public String getAsJsonString(final String key) { + JacksonEventKey jacksonEventKey = new JacksonEventKey(key, true, EventKeyFactory.EventAction.GET); + return getAsJsonString(jacksonEventKey); + } + /** * returns a string with formatted parts replaced by their values. The input * string may contain parts with format "${.../.../...}" which are replaced @@ -402,24 +449,35 @@ public EventMetadata getMetadata() { } @Override - public boolean containsKey(final String key) { - - final String trimmedKey = checkAndTrimKey(key); + public boolean containsKey(EventKey key) { + JacksonEventKey jacksonEventKey = asJacksonEventKey(key); - final JsonNode node = getNode(trimmedKey); + final JsonNode node = getNode(jacksonEventKey); return !node.isMissingNode(); } @Override - public boolean isValueAList(final String key) { - final String trimmedKey = checkAndTrimKey(key); + public boolean containsKey(final String key) { + JacksonEventKey jacksonEventKey = new JacksonEventKey(key, true, EventKeyFactory.EventAction.GET); + return containsKey(jacksonEventKey); + } - final JsonNode node = getNode(trimmedKey); + @Override + public boolean isValueAList(EventKey key) { + JacksonEventKey jacksonEventKey = asJacksonEventKey(key); + + final JsonNode node = getNode(jacksonEventKey); return node.isArray(); } + @Override + public boolean isValueAList(final String key) { + JacksonEventKey jacksonEventKey = new JacksonEventKey(key, true, EventKeyFactory.EventAction.GET); + return isValueAList(jacksonEventKey); + } + @Override public Map toMap() { return mapper.convertValue(jsonNode, MAP_TYPE_REFERENCE); @@ -427,30 +485,7 @@ public Map toMap() { public static boolean isValidEventKey(final String key) { - try { - checkKey(key); - return true; - } catch (final Exception e) { - return false; - } - } - private String checkAndTrimKey(final String key) { - checkKey(key); - return trimTrailingSlashInKey(key); - } - - private static void checkKey(final String key) { - checkNotNull(key, "key cannot be null"); - if (key.isEmpty()) { - // Empty string key is valid - return; - } - if (key.length() > MAX_KEY_LENGTH) { - throw new IllegalArgumentException("key cannot be longer than " + MAX_KEY_LENGTH + " characters"); - } - if (!isValidKey(key)) { - throw new IllegalArgumentException("key " + key + " must contain only alphanumeric chars with .-_@/ and must follow JsonPointer (ie. 'field/to/key')"); - } + return JacksonEventKey.isValidEventKey(key); } private String trimKey(final String key) { @@ -459,31 +494,6 @@ private String trimKey(final String key) { return trimTrailingSlashInKey(trimmedLeadingSlash); } - private String trimTrailingSlashInKey(final String key) { - return key.length() > 1 && key.endsWith(SEPARATOR) ? 
key.substring(0, key.length() - 1) : key; - } - - private static boolean isValidKey(final String key) { - for (int i = 0; i < key.length(); i++) { - char c = key.charAt(i); - - if (!(c >= 48 && c <= 57 - || c >= 65 && c <= 90 - || c >= 97 && c <= 122 - || c == '.' - || c == '-' - || c == '_' - || c == '@' - || c == '/' - || c == '[' - || c == ']')) { - - return false; - } - } - return true; - } - /** * Constructs an empty builder. * diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/JacksonEventKey.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/JacksonEventKey.java new file mode 100644 index 0000000000..50d59a6585 --- /dev/null +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/JacksonEventKey.java @@ -0,0 +1,194 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.model.event; + +import com.fasterxml.jackson.core.JsonPointer; + +import java.util.Arrays; +import java.util.Collections; +import java.util.EnumSet; +import java.util.List; +import java.util.Objects; +import java.util.Set; + +import static com.google.common.base.Preconditions.checkNotNull; + +class JacksonEventKey implements EventKey { + private static final String SEPARATOR = "/"; + private static final int MAX_KEY_LENGTH = 2048; + private final String key; + private final EventKeyFactory.EventAction[] eventActions; + private final String trimmedKey; + private List keyPathList; + private JsonPointer jsonPointer; + private final Set supportedActions; + + /** + * Constructor for the JacksonEventKey which should only be used by implementation + * of {@link EventKeyFactory} in Data Prepper core. + * + * @param key The key + * @param eventActions Event actions to support + */ + JacksonEventKey(final String key, final EventKeyFactory.EventAction... eventActions) { + this(key, false, eventActions); + } + + /** + * Constructs a new JacksonEventKey. + *
<p>
+ * This overload should only be used by {@link JacksonEvent} directly. It allows for skipping creating + * some resources knowing they will not be needed. The {@link JacksonEvent} only needs a JSON pointer + * when performing GET event actions. So we can optimize PUT/DELETE actions when called with a string + * key instead of an EventKey by not creating the JSON Pointer at all. + *
<p>
+ * For EventKey's constructed through the factory, we should not perform lazy initialization since + * we may lose some possible validations. + * + * @param key the key + * @param lazy Use true to lazily initialize. This will not be thread-safe, however. + * @param eventActions Event actions to support + */ + JacksonEventKey(final String key, final boolean lazy, final EventKeyFactory.EventAction... eventActions) { + this.key = Objects.requireNonNull(key, "Parameter key cannot be null for EventKey."); + this.eventActions = eventActions.length == 0 ? new EventKeyFactory.EventAction[] { EventKeyFactory.EventAction.ALL } : eventActions; + + supportedActions = EnumSet.noneOf(EventKeyFactory.EventAction.class); + for (final EventKeyFactory.EventAction eventAction : this.eventActions) { + supportedActions.addAll(eventAction.getSupportedActions()); + } + + if(key.isEmpty()) { + for (final EventKeyFactory.EventAction action : this.eventActions) { + if (action.isMutableAction()) { + throw new IllegalArgumentException("Event key cannot be an empty string for " + action + " actions."); + } + } + } + + trimmedKey = checkAndTrimKey(key); + + if(!lazy) { + keyPathList = toKeyPathList(); + jsonPointer = toJsonPointer(trimmedKey); + } + } + + @Override + public String getKey() { + return key; + } + + String getTrimmedKey() { + return trimmedKey; + } + + List getKeyPathList() { + if(keyPathList == null) { + keyPathList = toKeyPathList(); + } + return keyPathList; + } + + JsonPointer getJsonPointer() { + if(jsonPointer == null) { + jsonPointer = toJsonPointer(trimmedKey); + } + return jsonPointer; + } + + boolean supports(final EventKeyFactory.EventAction eventAction) { + return supportedActions.contains(eventAction); + } + + @Override + public boolean equals(final Object other) { + if (this == other) + return true; + if (other == null || getClass() != other.getClass()) + return false; + final JacksonEventKey that = (JacksonEventKey) other; + return Objects.equals(key, that.key) && Arrays.equals(eventActions, that.eventActions); + } + + @Override + public int hashCode() { + return Objects.hash(key, Arrays.hashCode(eventActions)); + } + + @Override + public String toString() { + return key; + } + + private String checkAndTrimKey(final String key) { + checkKey(key); + return trimTrailingSlashInKey(key); + } + + private static void checkKey(final String key) { + checkNotNull(key, "key cannot be null"); + if (key.isEmpty()) { + // Empty string key is valid + return; + } + if (key.length() > MAX_KEY_LENGTH) { + throw new IllegalArgumentException("key cannot be longer than " + MAX_KEY_LENGTH + " characters"); + } + if (!isValidKey(key)) { + throw new IllegalArgumentException("key " + key + " must contain only alphanumeric chars with .-_@/ and must follow JsonPointer (ie. 'field/to/key')"); + } + } + + + static String trimTrailingSlashInKey(final String key) { + return key.length() > 1 && key.endsWith(SEPARATOR) ? key.substring(0, key.length() - 1) : key; + } + + private static boolean isValidKey(final String key) { + for (int i = 0; i < key.length(); i++) { + char c = key.charAt(i); + + if (!(c >= 48 && c <= 57 + || c >= 65 && c <= 90 + || c >= 97 && c <= 122 + || c == '.' 
+ || c == '-' + || c == '_' + || c == '@' + || c == '/' + || c == '[' + || c == ']')) { + + return false; + } + } + return true; + } + + private List toKeyPathList() { + return Collections.unmodifiableList(Arrays.asList(trimmedKey.split(SEPARATOR, -1))); + } + + private static JsonPointer toJsonPointer(final String key) { + final String jsonPointerExpression; + if (key.isEmpty() || key.startsWith("/")) { + jsonPointerExpression = key; + } else { + jsonPointerExpression = SEPARATOR + key; + } + return JsonPointer.compile(jsonPointerExpression); + } + + static boolean isValidEventKey(final String key) { + try { + checkKey(key); + return true; + } catch (final Exception e) { + return false; + } + } +} diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/sink/AbstractSink.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/sink/AbstractSink.java index 1c3e596265..26dd7e98a6 100644 --- a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/sink/AbstractSink.java +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/sink/AbstractSink.java @@ -28,6 +28,7 @@ public abstract class AbstractSink> implements Sink { private Thread retryThread; private int maxRetries; private int waitTimeMs; + private SinkThread sinkThread; public AbstractSink(final PluginSetting pluginSetting, int numRetries, int waitTimeMs) { this.pluginMetrics = PluginMetrics.fromPluginSetting(pluginSetting); @@ -51,7 +52,8 @@ public void initialize() { // the exceptions which are not retryable. doInitialize(); if (!isReady() && retryThread == null) { - retryThread = new Thread(new SinkThread(this, maxRetries, waitTimeMs)); + sinkThread = new SinkThread(this, maxRetries, waitTimeMs); + retryThread = new Thread(sinkThread); retryThread.start(); } } @@ -76,7 +78,7 @@ public void output(Collection records) { @Override public void shutdown() { if (retryThread != null) { - retryThread.stop(); + sinkThread.stop(); } } diff --git a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/sink/SinkThread.java b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/sink/SinkThread.java index c304de37af..451cef7dff 100644 --- a/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/sink/SinkThread.java +++ b/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/sink/SinkThread.java @@ -10,6 +10,8 @@ class SinkThread implements Runnable { private int maxRetries; private int waitTimeMs; + private volatile boolean isStopped = false; + public SinkThread(AbstractSink sink, int maxRetries, int waitTimeMs) { this.sink = sink; this.maxRetries = maxRetries; @@ -19,11 +21,15 @@ public SinkThread(AbstractSink sink, int maxRetries, int waitTimeMs) { @Override public void run() { int numRetries = 0; - while (!sink.isReady() && numRetries++ < maxRetries) { + while (!sink.isReady() && numRetries++ < maxRetries && !isStopped) { try { Thread.sleep(waitTimeMs); sink.doInitialize(); } catch (InterruptedException e){} } } + + public void stop() { + isStopped = true; + } } diff --git a/data-prepper-api/src/test/java/org/opensearch/dataprepper/metrics/MetricsTestUtil.java b/data-prepper-api/src/test/java/org/opensearch/dataprepper/metrics/MetricsTestUtil.java index a77d9de349..f6c0602f9e 100644 --- a/data-prepper-api/src/test/java/org/opensearch/dataprepper/metrics/MetricsTestUtil.java +++ b/data-prepper-api/src/test/java/org/opensearch/dataprepper/metrics/MetricsTestUtil.java @@ -6,25 +6,37 @@ package org.opensearch.dataprepper.metrics; import 
io.micrometer.core.instrument.Measurement; +import io.micrometer.core.instrument.Meter; import io.micrometer.core.instrument.MeterRegistry; import io.micrometer.core.instrument.Metrics; import io.micrometer.core.instrument.Statistic; import io.micrometer.core.instrument.simple.SimpleMeterRegistry; +import java.util.ArrayList; +import java.util.HashSet; import java.util.List; +import java.util.Set; import java.util.stream.Collectors; import java.util.stream.StreamSupport; public class MetricsTestUtil { - public static void initMetrics() { - Metrics.globalRegistry.getRegistries().forEach(meterRegistry -> Metrics.globalRegistry.remove(meterRegistry)); - Metrics.globalRegistry.getMeters().forEach(meter -> Metrics.globalRegistry.remove(meter)); + public static synchronized void initMetrics() { + final Set registries = new HashSet<>(Metrics.globalRegistry.getRegistries()); + registries.forEach(Metrics.globalRegistry::remove); + + final List meters = new ArrayList<>(Metrics.globalRegistry.getMeters()); + meters.forEach(Metrics.globalRegistry::remove); + Metrics.addRegistry(new SimpleMeterRegistry()); } - public static List getMeasurementList(final String meterName) { - return StreamSupport.stream(getRegistry().find(meterName).meter().measure().spliterator(), false) + public static synchronized List getMeasurementList(final String meterName) { + final Meter meter = getRegistry().find(meterName).meter(); + if(meter == null) + throw new RuntimeException("No metrics meter is available for " + meterName); + + return StreamSupport.stream(meter.measure().spliterator(), false) .collect(Collectors.toList()); } diff --git a/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/AggregateEventHandleTests.java b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/AggregateEventHandleTests.java new file mode 100644 index 0000000000..9998d6eb6d --- /dev/null +++ b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/AggregateEventHandleTests.java @@ -0,0 +1,102 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.model.event; + +import org.opensearch.dataprepper.model.acknowledgements.AcknowledgementSet; +import static org.hamcrest.MatcherAssert.assertThat; +import static org.hamcrest.Matchers.equalTo; +import org.junit.jupiter.api.Test; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.when; +import static org.mockito.Mockito.verify; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.times; +import org.mockito.Mock; + +import java.lang.ref.WeakReference; +import java.time.Instant; + +class AggregateEventHandleTests { + @Mock + private AcknowledgementSet acknowledgementSet1; + @Mock + private AcknowledgementSet acknowledgementSet2; + private int count; + + @Test + void testBasic() { + Instant now = Instant.now(); + AggregateEventHandle eventHandle = new AggregateEventHandle(now); + assertThat(eventHandle.getInternalOriginationTime(), equalTo(now)); + assertThat(eventHandle.getExternalOriginationTime(), equalTo(null)); + assertThat(eventHandle.hasAcknowledgementSet(), equalTo(false)); + eventHandle.acquireReference(); + eventHandle.release(true); + } + + @Test + void testWithAcknowledgementSet() { + acknowledgementSet1 = mock(AcknowledgementSet.class); + acknowledgementSet2 = mock(AcknowledgementSet.class); + when(acknowledgementSet1.release(any(EventHandle.class), any(Boolean.class))).thenReturn(true); + 
when(acknowledgementSet2.release(any(EventHandle.class), any(Boolean.class))).thenReturn(true); + Instant now = Instant.now(); + AggregateEventHandle eventHandle = new AggregateEventHandle(now); + assertThat(eventHandle.getInternalOriginationTime(), equalTo(now)); + assertThat(eventHandle.getExternalOriginationTime(), equalTo(null)); + eventHandle.addAcknowledgementSet(acknowledgementSet1); + // just do duplicate add + eventHandle.addAcknowledgementSet(acknowledgementSet1); + assertThat(eventHandle.hasAcknowledgementSet(), equalTo(true)); + eventHandle.addAcknowledgementSet(acknowledgementSet2); + eventHandle.acquireReference(); + verify(acknowledgementSet1).acquire(eventHandle); + verify(acknowledgementSet2).acquire(eventHandle); + eventHandle.release(true); + verify(acknowledgementSet1).release(eventHandle, true); + verify(acknowledgementSet2).release(eventHandle, true); + } + + @Test + void testWithExternalOriginationTime() { + Instant now = Instant.now(); + AggregateEventHandle eventHandle = new AggregateEventHandle(now); + assertThat(eventHandle.hasAcknowledgementSet(), equalTo(false)); + assertThat(eventHandle.getInternalOriginationTime(), equalTo(now)); + assertThat(eventHandle.getExternalOriginationTime(), equalTo(null)); + eventHandle.setExternalOriginationTime(now.minusSeconds(60)); + assertThat(eventHandle.getExternalOriginationTime(), equalTo(now.minusSeconds(60))); + eventHandle.release(true); + } + + @Test + void testWithOnReleaseHandler() { + Instant now = Instant.now(); + count = 0; + AggregateEventHandle eventHandle = new AggregateEventHandle(now); + acknowledgementSet1 = mock(AcknowledgementSet.class); + acknowledgementSet2 = mock(AcknowledgementSet.class); + eventHandle.onRelease((handle, result) -> {if (result) count++; }); + eventHandle.addAcknowledgementSet(acknowledgementSet1); + assertThat(eventHandle.hasAcknowledgementSet(), equalTo(true)); + eventHandle.addAcknowledgementSet(acknowledgementSet2); + // Simulate weak reference object not available for + // verification tests to pass 100% + for (WeakReference acknowledgementSetRef: eventHandle.getAcknowledgementSetRefs()) { + if (acknowledgementSetRef.get() == acknowledgementSet2 ) { + acknowledgementSetRef.clear(); + break; + } + } + eventHandle.release(true); + assertThat(count, equalTo(1)); + verify(acknowledgementSet1, times(1)).release(eventHandle, true); + verify(acknowledgementSet2, times(0)).release(eventHandle, true); + + } + +} + diff --git a/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/DefaultEventHandleTests.java b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/DefaultEventHandleTests.java index b2a66b2d1d..f351febd11 100644 --- a/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/DefaultEventHandleTests.java +++ b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/DefaultEventHandleTests.java @@ -13,6 +13,7 @@ import static org.mockito.Mockito.when; import static org.mockito.Mockito.verify; import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.doNothing; import org.mockito.Mock; import java.time.Instant; @@ -29,6 +30,8 @@ void testBasic() { assertThat(eventHandle.getAcknowledgementSet(), equalTo(null)); assertThat(eventHandle.getInternalOriginationTime(), equalTo(now)); assertThat(eventHandle.getExternalOriginationTime(), equalTo(null)); + eventHandle.acquireReference(); + assertThat(eventHandle.hasAcknowledgementSet(), equalTo(false)); eventHandle.release(true); } @@ -36,12 +39,16 @@ void 
testBasic() { void testWithAcknowledgementSet() { acknowledgementSet = mock(AcknowledgementSet.class); when(acknowledgementSet.release(any(EventHandle.class), any(Boolean.class))).thenReturn(true); + doNothing().when(acknowledgementSet).acquire(any(EventHandle.class)); Instant now = Instant.now(); DefaultEventHandle eventHandle = new DefaultEventHandle(now); assertThat(eventHandle.getAcknowledgementSet(), equalTo(null)); assertThat(eventHandle.getInternalOriginationTime(), equalTo(now)); assertThat(eventHandle.getExternalOriginationTime(), equalTo(null)); - eventHandle.setAcknowledgementSet(acknowledgementSet); + eventHandle.addAcknowledgementSet(acknowledgementSet); + assertThat(eventHandle.hasAcknowledgementSet(), equalTo(true)); + eventHandle.acquireReference(); + verify(acknowledgementSet).acquire(eventHandle); eventHandle.release(true); verify(acknowledgementSet).release(eventHandle, true); } diff --git a/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/EventActionTest.java b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/EventActionTest.java new file mode 100644 index 0000000000..edb63fa663 --- /dev/null +++ b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/EventActionTest.java @@ -0,0 +1,69 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.model.event; + +import org.junit.jupiter.api.extension.ExtensionContext; +import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.Arguments; +import org.junit.jupiter.params.provider.ArgumentsProvider; +import org.junit.jupiter.params.provider.ArgumentsSource; +import org.junit.jupiter.params.provider.EnumSource; + +import java.util.stream.Stream; + +import static org.hamcrest.CoreMatchers.equalTo; +import static org.hamcrest.CoreMatchers.hasItem; +import static org.hamcrest.MatcherAssert.assertThat; +import static org.junit.jupiter.params.provider.Arguments.arguments; + +class EventActionTest { + @ParameterizedTest + @EnumSource(value = EventKeyFactory.EventAction.class, mode = EnumSource.Mode.EXCLUDE, names = {"GET"}) + void isMutableAction_is_true_for_mutable_actions(final EventKeyFactory.EventAction eventAction) { + assertThat(eventAction.isMutableAction(), equalTo(true)); + } + + @ParameterizedTest + @EnumSource(value = EventKeyFactory.EventAction.class, mode = EnumSource.Mode.INCLUDE, names = {"GET"}) + void isMutableAction_is_false_for_mutable_actions(final EventKeyFactory.EventAction eventAction) { + assertThat(eventAction.isMutableAction(), equalTo(false)); + } + + @ParameterizedTest + @EnumSource(value = EventKeyFactory.EventAction.class) + void getSupportedActions_includes_self(final EventKeyFactory.EventAction eventAction) { + assertThat(eventAction.getSupportedActions(), hasItem(eventAction)); + } + + @ParameterizedTest + @EnumSource(value = EventKeyFactory.EventAction.class) + void getSupportedActions_includes_for_all_actions_when_ALL(final EventKeyFactory.EventAction eventAction) { + assertThat(EventKeyFactory.EventAction.ALL.getSupportedActions(), hasItem(eventAction)); + } + + @ParameterizedTest + @ArgumentsSource(SupportsArgumentsProvider.class) + void supports_returns_expected_value(final EventKeyFactory.EventAction eventAction, final EventKeyFactory.EventAction otherAction, final boolean expectedSupports) { + assertThat(eventAction.getSupportedActions().contains(otherAction), equalTo(expectedSupports)); + } + + static class SupportsArgumentsProvider 
implements ArgumentsProvider { + @Override + public Stream<? extends Arguments> provideArguments(final ExtensionContext extensionContext) throws Exception { + return Stream.of( + arguments(EventKeyFactory.EventAction.GET, EventKeyFactory.EventAction.PUT, false), + arguments(EventKeyFactory.EventAction.GET, EventKeyFactory.EventAction.DELETE, false), + arguments(EventKeyFactory.EventAction.GET, EventKeyFactory.EventAction.ALL, false), + arguments(EventKeyFactory.EventAction.PUT, EventKeyFactory.EventAction.GET, false), + arguments(EventKeyFactory.EventAction.PUT, EventKeyFactory.EventAction.DELETE, false), + arguments(EventKeyFactory.EventAction.PUT, EventKeyFactory.EventAction.ALL, false), + arguments(EventKeyFactory.EventAction.DELETE, EventKeyFactory.EventAction.GET, false), + arguments(EventKeyFactory.EventAction.DELETE, EventKeyFactory.EventAction.PUT, false), + arguments(EventKeyFactory.EventAction.DELETE, EventKeyFactory.EventAction.ALL, false) + ); + } + } +} \ No newline at end of file diff --git a/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/EventKeyFactoryTest.java b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/EventKeyFactoryTest.java new file mode 100644 index 0000000000..c2ed2d56f3 --- /dev/null +++ b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/EventKeyFactoryTest.java @@ -0,0 +1,47 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.model.event; + +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; + +import java.util.UUID; + +import static org.hamcrest.CoreMatchers.equalTo; +import static org.hamcrest.MatcherAssert.assertThat; +import static org.mockito.ArgumentMatchers.anyString; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.when; + +@ExtendWith(MockitoExtension.class) +class EventKeyFactoryTest { + + private String keyPath; + + @Mock + private EventKey eventKey; + + @BeforeEach + void setUp() { + keyPath = UUID.randomUUID().toString(); + } + + private EventKeyFactory createObjectUnderTest() { + return mock(EventKeyFactory.class); + } + + @Test + void createEventKey_calls_with_ALL_action() { + final EventKeyFactory objectUnderTest = createObjectUnderTest(); + when(objectUnderTest.createEventKey(anyString())).thenCallRealMethod(); + when(objectUnderTest.createEventKey(keyPath, EventKeyFactory.EventAction.ALL)).thenReturn(eventKey); + + assertThat(objectUnderTest.createEventKey(keyPath), equalTo(eventKey)); + } +} \ No newline at end of file diff --git a/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/JacksonEventKeyTest.java b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/JacksonEventKeyTest.java new file mode 100644 index 0000000000..5eb696a374 --- /dev/null +++ b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/JacksonEventKeyTest.java @@ -0,0 +1,284 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.model.event; + +import com.fasterxml.jackson.core.JsonPointer; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtensionContext; +import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.Arguments; +import org.junit.jupiter.params.provider.ArgumentsProvider;
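// Aside on the factory contract exercised by EventKeyFactoryTest above: the test stubs the
// two-argument createEventKey and calls the real one-argument method, which suggests the
// one-argument overload is a default method delegating with EventAction.ALL. A hedged,
// illustrative sketch of that shape (an assumption for illustration; the real interface in
// data-prepper-api may differ):
interface EventKeyFactorySketch {
    EventKey createEventKey(String key, EventKeyFactory.EventAction... forActions);

    default EventKey createEventKey(final String key) {
        // Delegating with ALL makes the key support every action, which matches
        // createEventKey_calls_with_ALL_action above.
        return createEventKey(key, EventKeyFactory.EventAction.ALL);
    }
}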
+import org.junit.jupiter.params.provider.ArgumentsSource; +import org.junit.jupiter.params.provider.CsvSource; +import org.junit.jupiter.params.provider.EnumSource; +import org.junit.jupiter.params.provider.ValueSource; + +import java.util.List; +import java.util.UUID; +import java.util.stream.Stream; + +import static org.hamcrest.CoreMatchers.equalTo; +import static org.hamcrest.CoreMatchers.not; +import static org.hamcrest.CoreMatchers.notNullValue; +import static org.hamcrest.CoreMatchers.sameInstance; +import static org.hamcrest.MatcherAssert.assertThat; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.junit.jupiter.params.provider.Arguments.arguments; + +class JacksonEventKeyTest { + @Test + void constructor_throws_with_null_key() { + assertThrows(NullPointerException.class, () -> new JacksonEventKey(null)); + } + + @Test + void getKey_with_empty_string_for_GET() { + final JacksonEventKey objectUnderTest = new JacksonEventKey("", EventKeyFactory.EventAction.GET); + assertThat(objectUnderTest.getKey(), equalTo("")); + assertThat(objectUnderTest.getTrimmedKey(), equalTo("")); + assertThat(objectUnderTest.getKeyPathList(), notNullValue()); + assertThat(objectUnderTest.getKeyPathList(), equalTo(List.of(""))); + assertThat(objectUnderTest.getJsonPointer(), notNullValue()); + } + + @ParameterizedTest + @EnumSource(value = EventKeyFactory.EventAction.class, mode = EnumSource.Mode.EXCLUDE, names = {"GET"}) + void constructor_throws_with_empty_string_for_unsupported_actions(final EventKeyFactory.EventAction eventAction) { + assertThrows(IllegalArgumentException.class, () -> new JacksonEventKey("", eventAction)); + } + + + @ParameterizedTest + @ValueSource(strings = { + "inv(alid", + "getMetadata(\"test_key\")" + }) + void constructor_throws_with_invalid_key(final String key) { + assertThrows(IllegalArgumentException.class, () -> new JacksonEventKey(key)); + } + + @ParameterizedTest + @ValueSource(strings = { + "test_key", + "/test_key", + "key.with.dot", + "key-with-hyphen", + "key_with_underscore", + "key@with@at", + "key[with]brackets" + }) + void getKey_returns_expected_result(final String key) { + assertThat(new JacksonEventKey(key).getKey(), equalTo(key)); + } + + @ParameterizedTest + @CsvSource(value = { + "test_key, test_key", + "/test_key, /test_key", + "/test_key/, /test_key", + "key.with.dot, key.with.dot", + "key-with-hyphen, key-with-hyphen", + "key_with_underscore, key_with_underscore", + "key@with@at, key@with@at", + "key[with]brackets, key[with]brackets" + }) + void getTrimmedKey_returns_expected_result(final String key, final String expectedTrimmedKey) { + assertThat(new JacksonEventKey(key).getTrimmedKey(), equalTo(expectedTrimmedKey)); + } + + @ParameterizedTest + @ArgumentsSource(KeyPathListArgumentsProvider.class) + void getKeyPathList_returns_expected_value(final String key, final List<String> expectedKeyPathList) { + assertThat(new JacksonEventKey(key).getKeyPathList(), equalTo(expectedKeyPathList)); + } + + @Test + void getJsonPointer_returns_a_valid_JsonPointer() { + final String testKey = UUID.randomUUID().toString(); + final JacksonEventKey objectUnderTest = new JacksonEventKey(testKey); + + final JsonPointer jsonPointer = objectUnderTest.getJsonPointer(); + assertThat(jsonPointer, notNullValue()); + assertThat(jsonPointer.toString(), equalTo("/" + testKey)); + } + + @Test + void getJsonPointer_returns_the_same_instance_for_multiple_calls() { + final String testKey = UUID.randomUUID().toString(); + final JacksonEventKey objectUnderTest = new
JacksonEventKey(testKey); + + final JsonPointer jsonPointer = objectUnderTest.getJsonPointer(); + assertThat(objectUnderTest.getJsonPointer(), sameInstance(jsonPointer)); + assertThat(objectUnderTest.getJsonPointer(), sameInstance(jsonPointer)); + } + + @ParameterizedTest + @EnumSource(value = EventKeyFactory.EventAction.class) + void getJsonPointer_returns_valid_JsonPointer_when_constructed_with_fromJacksonEvent(final EventKeyFactory.EventAction eventAction) { + final String testKey = UUID.randomUUID().toString(); + final JacksonEventKey objectUnderTest = new JacksonEventKey(testKey, true, eventAction); + + final JsonPointer jsonPointer = objectUnderTest.getJsonPointer(); + assertThat(jsonPointer, notNullValue()); + assertThat(jsonPointer.toString(), equalTo("/" + testKey)); + } + + @ParameterizedTest + @ArgumentsSource(KeyPathListArgumentsProvider.class) + void getKeyPathList_returns_expected_value_when_constructed_with_fromJacksonEvent(final String key, final List<String> expectedKeyPathList) { + assertThat(new JacksonEventKey(key, true).getKeyPathList(), equalTo(expectedKeyPathList)); + } + + @ParameterizedTest + @ArgumentsSource(SupportsArgumentsProvider.class) + void supports_returns_true_if_any_supports(final List<EventKeyFactory.EventAction> eventActionsList, final EventKeyFactory.EventAction otherAction, final boolean expectedSupports) { + final String testKey = UUID.randomUUID().toString(); + final EventKeyFactory.EventAction[] eventActions = new EventKeyFactory.EventAction[eventActionsList.size()]; + eventActionsList.toArray(eventActions); + assertThat(new JacksonEventKey(testKey, eventActions).supports(otherAction), equalTo(expectedSupports)); + } + + @ParameterizedTest + @CsvSource(value = { + "test_key, true", + "/test_key, true", + "inv(alid, false", + "getMetadata(\"test_key\"), false", + "key.with.dot, true", + "key-with-hyphen, true", + "key_with_underscore, true", + "key@with@at, true", + "key[with]brackets, true" + }) + void isValidEventKey_returns_expected_result(final String key, final boolean isValid) { + assertThat(JacksonEventKey.isValidEventKey(key), equalTo(isValid)); + } + + + static class KeyPathListArgumentsProvider implements ArgumentsProvider { + @Override + public Stream<? extends Arguments> provideArguments(final ExtensionContext extensionContext) { + return Stream.of( + arguments("test_key", List.of("test_key")), + arguments("a/b", List.of("a", "b")), + arguments("a/b/", List.of("a", "b")), + arguments("a/b/c", List.of("a", "b", "c")), + arguments("a/b/c/", List.of("a", "b", "c")) + ); + } + } + + static class SupportsArgumentsProvider implements ArgumentsProvider { + @Override + public Stream<? extends Arguments> provideArguments(final ExtensionContext extensionContext) throws Exception { + return Stream.of( + arguments(List.of(), EventKeyFactory.EventAction.GET, true), + arguments(List.of(), EventKeyFactory.EventAction.PUT, true), + arguments(List.of(), EventKeyFactory.EventAction.DELETE, true), + arguments(List.of(), EventKeyFactory.EventAction.ALL, true), + arguments(List.of(EventKeyFactory.EventAction.GET), EventKeyFactory.EventAction.GET, true), + arguments(List.of(EventKeyFactory.EventAction.PUT), EventKeyFactory.EventAction.PUT, true), + arguments(List.of(EventKeyFactory.EventAction.DELETE), EventKeyFactory.EventAction.DELETE, true), + arguments(List.of(EventKeyFactory.EventAction.GET), EventKeyFactory.EventAction.PUT, false), + arguments(List.of(EventKeyFactory.EventAction.GET, EventKeyFactory.EventAction.PUT), EventKeyFactory.EventAction.PUT, true), + arguments(List.of(EventKeyFactory.EventAction.PUT,
EventKeyFactory.EventAction.GET), EventKeyFactory.EventAction.PUT, true), + arguments(List.of(EventKeyFactory.EventAction.DELETE), EventKeyFactory.EventAction.PUT, false), + arguments(List.of(EventKeyFactory.EventAction.DELETE, EventKeyFactory.EventAction.GET), EventKeyFactory.EventAction.PUT, false), + arguments(List.of(EventKeyFactory.EventAction.DELETE, EventKeyFactory.EventAction.GET, EventKeyFactory.EventAction.PUT), EventKeyFactory.EventAction.PUT, true), + arguments(List.of(EventKeyFactory.EventAction.ALL), EventKeyFactory.EventAction.GET, true), + arguments(List.of(EventKeyFactory.EventAction.ALL), EventKeyFactory.EventAction.PUT, true), + arguments(List.of(EventKeyFactory.EventAction.ALL), EventKeyFactory.EventAction.DELETE, true) + ); + } + } + + @ParameterizedTest + @EnumSource(EventKeyFactory.EventAction.class) + void equals_returns_true_for_same_key_and_actions(final EventKeyFactory.EventAction eventAction) { + final String testKey = UUID.randomUUID().toString(); + final JacksonEventKey objectUnderTest = new JacksonEventKey(testKey, eventAction); + final JacksonEventKey other = new JacksonEventKey(testKey, eventAction); + + assertThat(objectUnderTest.equals(other), equalTo(true)); + } + + @Test + void equals_returns_true_for_same_instance() { + final JacksonEventKey objectUnderTest = new JacksonEventKey(UUID.randomUUID().toString(), + EventKeyFactory.EventAction.PUT); + + assertThat(objectUnderTest.equals(objectUnderTest), equalTo(true)); + } + + @Test + void equals_returns_false_for_null() { + final JacksonEventKey objectUnderTest = new JacksonEventKey(UUID.randomUUID().toString(), + EventKeyFactory.EventAction.PUT); + + assertThat(objectUnderTest.equals(null), equalTo(false)); + } + + @Test + void equals_returns_false_for_non_EventKey() { + final String testKey = UUID.randomUUID().toString(); + final JacksonEventKey objectUnderTest = new JacksonEventKey(testKey, + EventKeyFactory.EventAction.PUT); + + assertThat(objectUnderTest.equals(testKey), equalTo(false)); + } + + @Test + void equals_returns_false_for_same_key_but_different_actions() { + final String testKey = UUID.randomUUID().toString(); + final JacksonEventKey objectUnderTest = new JacksonEventKey(testKey, EventKeyFactory.EventAction.PUT); + final JacksonEventKey other = new JacksonEventKey(testKey, EventKeyFactory.EventAction.GET); + + assertThat(objectUnderTest.equals(other), equalTo(false)); + } + + @ParameterizedTest + @EnumSource(EventKeyFactory.EventAction.class) + void equals_returns_false_for_different_key_but_same_actions(final EventKeyFactory.EventAction eventAction) { + final JacksonEventKey objectUnderTest = new JacksonEventKey(UUID.randomUUID().toString(), eventAction); + final JacksonEventKey other = new JacksonEventKey(UUID.randomUUID().toString(), eventAction); + + assertThat(objectUnderTest.equals(other), equalTo(false)); + } + + @ParameterizedTest + @EnumSource(EventKeyFactory.EventAction.class) + void hashCode_is_the_same_for_same_key_and_actions(final EventKeyFactory.EventAction eventAction) { + final String testKey = UUID.randomUUID().toString(); + final JacksonEventKey objectUnderTest = new JacksonEventKey(testKey, eventAction); + final JacksonEventKey other = new JacksonEventKey(testKey, eventAction); + + assertThat(objectUnderTest.hashCode(), equalTo(other.hashCode())); + } + + @ParameterizedTest + @CsvSource({ + "test, PUT, test2, PUT", + "test, PUT, test, GET" + }) + void hashCode_differs_for_different_key_or_actions( + final String testKey, final
EventKeyFactory.EventAction eventAction, + final String testKeyOther, final EventKeyFactory.EventAction eventActionOther) { + final JacksonEventKey objectUnderTest = new JacksonEventKey(testKey, eventAction); + final JacksonEventKey other = new JacksonEventKey(testKeyOther, eventActionOther); + + assertThat(objectUnderTest.hashCode(), not(equalTo(other.hashCode()))); + } + + @ParameterizedTest + @EnumSource(EventKeyFactory.EventAction.class) + void toString_returns_the_key(final EventKeyFactory.EventAction eventAction) { + final String testKey = UUID.randomUUID().toString(); + final JacksonEventKey objectUnderTest = new JacksonEventKey(testKey, eventAction); + + assertThat(objectUnderTest.toString(), equalTo(testKey)); + } +} \ No newline at end of file diff --git a/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/JacksonEventTest.java b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/JacksonEventTest.java index 1a7efb7467..90645d2961 100644 --- a/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/JacksonEventTest.java +++ b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/JacksonEventTest.java @@ -74,6 +74,53 @@ public void testPutAndGet_withRandomString() { assertThat(result, is(equalTo(value))); } + @Test + public void testPutAndGet_withRandomString_eventKey() { + final EventKey key = new JacksonEventKey("aRandomKey" + UUID.randomUUID()); + final UUID value = UUID.randomUUID(); + + event.put(key, value); + final UUID result = event.get(key, UUID.class); + + assertThat(result, is(notNullValue())); + assertThat(result, is(equalTo(value))); + } + + @Test + public void testPutAndGet_withRandomString_eventKey_multiple_events() { + final EventKey key = new JacksonEventKey("aRandomKey" + UUID.randomUUID()); + final UUID value = UUID.randomUUID(); + + for(int i = 0; i < 10; i++) { + event = JacksonEvent.builder() + .withEventType(eventType) + .build(); + + event.put(key, value); + final UUID result = event.get(key, UUID.class); + + assertThat(result, is(notNullValue())); + assertThat(result, is(equalTo(value))); + } + } + + @Test + public void testPutAndGet_eventKey_with_non_JacksonEventKey_throws() { + final EventKey key = mock(EventKey.class); + final UUID value = UUID.randomUUID(); + + assertThrows(IllegalArgumentException.class, () -> event.put(key, value)); + assertThrows(IllegalArgumentException.class, () -> event.get(key, UUID.class)); + } + + @Test + public void testPut_eventKey_with_immutable_action() { + final EventKey key = new JacksonEventKey("aRandomKey" + UUID.randomUUID(), EventKeyFactory.EventAction.GET); + final UUID value = UUID.randomUUID(); + + assertThrows(IllegalArgumentException.class, () -> event.put(key, value)); + } + @ParameterizedTest @ValueSource(strings = {"/", "foo", "foo-bar", "foo_bar", "foo.bar", "/foo", "/foo/", "a1K.k3-01_02", "keyWithBrackets[]"}) void testPutAndGet_withStrings(final String key) { @@ -86,6 +133,19 @@ void testPutAndGet_withStrings(final String key) { assertThat(result, is(equalTo(value))); } + @ParameterizedTest + @ValueSource(strings = {"/", "foo", "foo-bar", "foo_bar", "foo.bar", "/foo", "/foo/", "a1K.k3-01_02", "keyWithBrackets[]"}) + void testPutAndGet_withStrings_eventKey(final String key) { + final UUID value = UUID.randomUUID(); + + final EventKey eventKey = new JacksonEventKey(key); + event.put(eventKey, value); + final UUID result = event.get(eventKey, UUID.class); + + assertThat(result, is(notNullValue())); + assertThat(result, is(equalTo(value))); + 
} + @Test public void testPutKeyCannotBeEmptyString() { Throwable exception = assertThrows(IllegalArgumentException.class, () -> event.put("", "value")); @@ -93,7 +153,7 @@ public void testPutKeyCannotBeEmptyString() { } @Test - public void testPutAndGet_withMultLevelKey() { + public void testPutAndGet_withMultiLevelKey() { final String key = "foo/bar"; final UUID value = UUID.randomUUID(); @@ -104,6 +164,18 @@ public void testPutAndGet_withMultLevelKey() { assertThat(result, is(equalTo(value))); } + @Test + public void testPutAndGet_withMultiLevelKey_eventKey() { + final EventKey key = new JacksonEventKey("foo/bar"); + final UUID value = UUID.randomUUID(); + + event.put(key, value); + final UUID result = event.get(key, UUID.class); + + assertThat(result, is(notNullValue())); + assertThat(result, is(equalTo(value))); + } + @Test public void testPutAndGet_withMultiLevelKeyTwice() { final String key = "foo/bar"; @@ -125,6 +197,27 @@ public void testPutAndGet_withMultiLevelKeyTwice() { assertThat(result2, is(equalTo(value2))); } + @Test + public void testPutAndGet_withMultiLevelKeyTwice_eventKey() { + final EventKey key = new JacksonEventKey("foo/bar"); + final UUID value = UUID.randomUUID(); + + event.put(key, value); + final UUID result = event.get(key, UUID.class); + + assertThat(result, is(notNullValue())); + assertThat(result, is(equalTo(value))); + + final EventKey key2 = new JacksonEventKey("foo/fizz"); + final UUID value2 = UUID.randomUUID(); + + event.put(key2, value2); + final UUID result2 = event.get(key2, UUID.class); + + assertThat(result2, is(notNullValue())); + assertThat(result2, is(equalTo(value2))); + } + @Test public void testPutAndGet_withMultiLevelKeyWithADash() { final String key = "foo/bar-bar"; @@ -137,6 +230,18 @@ public void testPutAndGet_withMultiLevelKeyWithADash() { assertThat(result, is(equalTo(value))); } + @Test + public void testPutAndGet_withMultiLevelKeyWithADash_eventKey() { + final EventKey key = new JacksonEventKey("foo/bar-bar"); + final UUID value = UUID.randomUUID(); + + event.put(key, value); + final UUID result = event.get(key, UUID.class); + + assertThat(result, is(notNullValue())); + assertThat(result, is(equalTo(value))); + } + @ParameterizedTest @ValueSource(strings = {"foo", "/foo", "/foo/", "foo/"}) void testGetAtRootLevel(final String key) { @@ -148,6 +253,17 @@ void testGetAtRootLevel(final String key) { assertThat(result, is(Map.of("foo", value))); } + @ParameterizedTest + @ValueSource(strings = {"foo", "/foo", "/foo/", "foo/"}) + void testGetAtRootLevel_eventKey(final String key) { + final String value = UUID.randomUUID().toString(); + + event.put(new JacksonEventKey(key), value); + final Map result = event.get(new JacksonEventKey("", EventKeyFactory.EventAction.GET), Map.class); + + assertThat(result, is(Map.of("foo", value))); + } + @ParameterizedTest @ValueSource(strings = {"/foo/bar", "foo/bar", "foo/bar/"}) void testGetAtRootLevelWithMultiLevelKey(final String key) { @@ -159,6 +275,17 @@ void testGetAtRootLevelWithMultiLevelKey(final String key) { assertThat(result, is(Map.of("foo", Map.of("bar", value)))); } + @ParameterizedTest + @ValueSource(strings = {"/foo/bar", "foo/bar", "foo/bar/"}) + void testGetAtRootLevelWithMultiLevelKey_eventKey(final String key) { + final String value = UUID.randomUUID().toString(); + + event.put(new JacksonEventKey(key), value); + final Map result = event.get( new JacksonEventKey("", EventKeyFactory.EventAction.GET), Map.class); + + assertThat(result, is(Map.of("foo", Map.of("bar", value)))); + } + @Test 
public void testPutUpdateAndGet_withPojo() { final String key = "foo/bar"; @@ -293,6 +420,14 @@ public void testDeleteKey(final String key) { assertThat(result, is(nullValue())); } + @Test + public void testDelete_eventKey_with_immutable_action() { + final EventKey key = new JacksonEventKey("aRandomKey" + UUID.randomUUID(), EventKeyFactory.EventAction.GET); + final UUID value = UUID.randomUUID(); + + assertThrows(IllegalArgumentException.class, () -> event.delete(key)); + } + @Test public void testClear() { event.put("key1", UUID.randomUUID()); diff --git a/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/JacksonEvent_JavaSerializationTest.java b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/JacksonEvent_JavaSerializationTest.java index b3ee46b55c..160f08d673 100644 --- a/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/JacksonEvent_JavaSerializationTest.java +++ b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/event/JacksonEvent_JavaSerializationTest.java @@ -8,6 +8,7 @@ import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; import org.opensearch.dataprepper.model.acknowledgements.AcknowledgementSet; +import static org.junit.jupiter.api.Assertions.assertFalse; import java.io.ByteArrayInputStream; import java.io.ByteArrayOutputStream; @@ -20,7 +21,6 @@ import static org.hamcrest.MatcherAssert.assertThat; import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.instanceOf; -import static org.hamcrest.Matchers.nullValue; import static org.mockito.Mockito.mock; class JacksonEvent_JavaSerializationTest { @@ -54,7 +54,7 @@ void serialize_without_acknowledgementSet_includes_data() throws IOException, Cl assertThat(deserializedEvent.getMetadata(), equalTo(objectUnderTest.getMetadata())); assertThat(deserializedEvent.getEventHandle(), instanceOf(InternalEventHandle.class)); - assertThat(((InternalEventHandle) deserializedEvent.getEventHandle()).getAcknowledgementSet(), nullValue()); + assertFalse(((InternalEventHandle) deserializedEvent.getEventHandle()).hasAcknowledgementSet()); assertThat(deserializedEvent.getEventHandle().getInternalOriginationTime(), equalTo(objectUnderTest.getMetadata().getTimeReceived())); } @@ -63,7 +63,7 @@ void serialize_without_acknowledgementSet_includes_data() throws IOException, Cl void serialize_with_acknowledgementSet_does_not_include_old_acknowledgement_set() throws IOException, ClassNotFoundException { final JacksonEvent objectUnderTest = createObjectUnderTest(); final InternalEventHandle internalEventHandle = (InternalEventHandle) objectUnderTest.getEventHandle(); - internalEventHandle.setAcknowledgementSet(mock(AcknowledgementSet.class)); + internalEventHandle.addAcknowledgementSet(mock(AcknowledgementSet.class)); final Object deserializedObject = serializeAndDeserialize(objectUnderTest); @@ -74,7 +74,7 @@ void serialize_with_acknowledgementSet_does_not_include_old_acknowledgement_set( assertThat(deserializedEvent.getMetadata(), equalTo(objectUnderTest.getMetadata())); assertThat(deserializedEvent.getEventHandle(), instanceOf(InternalEventHandle.class)); - assertThat(((InternalEventHandle) deserializedEvent.getEventHandle()).getAcknowledgementSet(), nullValue()); + assertFalse(((InternalEventHandle) deserializedEvent.getEventHandle()).hasAcknowledgementSet()); assertThat(deserializedEvent.getEventHandle().getInternalOriginationTime(), equalTo(objectUnderTest.getMetadata().getTimeReceived())); } @@ -84,4 +84,4 @@ private Object 
serializeAndDeserialize(final JacksonEvent objectUnderTest) throw return objectInputStream.readObject(); } -} \ No newline at end of file +} diff --git a/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/sink/AbstractSinkTest.java b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/sink/AbstractSinkTest.java index 3b9fe7c007..8d1af7ea44 100644 --- a/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/sink/AbstractSinkTest.java +++ b/data-prepper-api/src/test/java/org/opensearch/dataprepper/model/sink/AbstractSinkTest.java @@ -11,15 +11,10 @@ import org.opensearch.dataprepper.metrics.MetricNames; import org.opensearch.dataprepper.metrics.MetricsTestUtil; import org.opensearch.dataprepper.model.configuration.PluginSetting; -import org.opensearch.dataprepper.model.record.Record; import org.opensearch.dataprepper.model.event.Event; -import org.opensearch.dataprepper.model.event.JacksonEvent; import org.opensearch.dataprepper.model.event.EventHandle; - -import static org.junit.jupiter.api.Assertions.assertEquals; -import static org.junit.jupiter.api.Assertions.assertTrue; -import static org.mockito.Mockito.when; -import static org.mockito.Mockito.mock; +import org.opensearch.dataprepper.model.event.JacksonEvent; +import org.opensearch.dataprepper.model.record.Record; import java.time.Duration; import java.util.Arrays; @@ -30,6 +25,12 @@ import java.util.UUID; import static org.awaitility.Awaitility.await; +import static org.hamcrest.CoreMatchers.equalTo; +import static org.hamcrest.MatcherAssert.assertThat; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.when; class AbstractSinkTest { private int count; @@ -71,13 +72,13 @@ void testMetrics() { } @Test - void testSinkNotReady() { + void testSinkNotReady() throws InterruptedException { final String sinkName = "testSink"; final String pipelineName = "pipelineName"; MetricsTestUtil.initMetrics(); PluginSetting pluginSetting = new PluginSetting(sinkName, Collections.emptyMap()); pluginSetting.setPipelineName(pipelineName); - AbstractSink> abstractSink = new AbstractSinkNotReadyImpl(pluginSetting); + AbstractSinkNotReadyImpl abstractSink = new AbstractSinkNotReadyImpl(pluginSetting); abstractSink.initialize(); assertEquals(abstractSink.isReady(), false); assertEquals(abstractSink.getRetryThreadState(), Thread.State.RUNNABLE); @@ -87,7 +88,10 @@ void testSinkNotReady() { await().atMost(Duration.ofSeconds(5)) .until(abstractSink::isReady); assertEquals(abstractSink.getRetryThreadState(), Thread.State.TERMINATED); + int initCountBeforeShutdown = abstractSink.initCount; abstractSink.shutdown(); + Thread.sleep(200); + assertThat(abstractSink.initCount, equalTo(initCountBeforeShutdown)); } @Test diff --git a/data-prepper-core/build.gradle b/data-prepper-core/build.gradle index 429e07069c..c939129a1c 100644 --- a/data-prepper-core/build.gradle +++ b/data-prepper-core/build.gradle @@ -48,7 +48,6 @@ dependencies { exclude group: 'commons-logging', module: 'commons-logging' } implementation 'software.amazon.cloudwatchlogs:aws-embedded-metrics:2.0.0-beta-1' - testImplementation 'org.apache.logging.log4j:log4j-jpl:2.23.0' testImplementation testLibs.spring.test implementation libs.armeria.core implementation libs.armeria.grpc @@ -60,7 +59,6 @@ dependencies { implementation 'software.amazon.awssdk:servicediscovery' implementation 
'com.fasterxml.jackson.datatype:jackson-datatype-jsr310' testImplementation testLibs.junit.vintage - testImplementation testLibs.mockito.inline testImplementation libs.commons.lang3 testImplementation project(':data-prepper-test-event') testImplementation project(':data-prepper-test-common') @@ -90,8 +88,6 @@ task integrationTest(type: Test) { classpath = sourceSets.integrationTest.runtimeClasspath - systemProperty 'log4j.configurationFile', 'src/test/resources/log4j2.properties' - filter { includeTestsMatching '*IT' } diff --git a/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/integration/ProcessorPipelineIT.java b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/integration/ProcessorPipelineIT.java new file mode 100644 index 0000000000..8673fd9f21 --- /dev/null +++ b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/integration/ProcessorPipelineIT.java @@ -0,0 +1,121 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.integration; + +import org.junit.jupiter.api.AfterEach; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.JacksonEvent; +import org.opensearch.dataprepper.model.record.Record; +import org.opensearch.dataprepper.plugins.InMemorySinkAccessor; +import org.opensearch.dataprepper.plugins.InMemorySourceAccessor; +import org.opensearch.dataprepper.test.framework.DataPrepperTestRunner; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.util.Collections; +import java.util.List; +import java.util.UUID; +import java.util.concurrent.TimeUnit; +import java.util.stream.Collectors; +import java.util.stream.IntStream; + +import static org.awaitility.Awaitility.await; +import static org.hamcrest.CoreMatchers.equalTo; +import static org.hamcrest.CoreMatchers.not; +import static org.hamcrest.CoreMatchers.notNullValue; +import static org.hamcrest.MatcherAssert.assertThat; +import static org.hamcrest.Matchers.empty; + +class ProcessorPipelineIT { + private static final Logger LOG = LoggerFactory.getLogger(ProcessorPipelineIT.class); + private static final String IN_MEMORY_IDENTIFIER = "ProcessorPipelineIT"; + private static final String PIPELINE_CONFIGURATION_UNDER_TEST = "processor-pipeline.yaml"; + private DataPrepperTestRunner dataPrepperTestRunner; + private InMemorySourceAccessor inMemorySourceAccessor; + private InMemorySinkAccessor inMemorySinkAccessor; + + @BeforeEach + void setUp() { + dataPrepperTestRunner = DataPrepperTestRunner.builder() + .withPipelinesDirectoryOrFile(PIPELINE_CONFIGURATION_UNDER_TEST) + .build(); + + inMemorySourceAccessor = dataPrepperTestRunner.getInMemorySourceAccessor(); + inMemorySinkAccessor = dataPrepperTestRunner.getInMemorySinkAccessor(); + dataPrepperTestRunner.start(); + LOG.info("Started test runner."); + } + + @AfterEach + void tearDown() { + LOG.info("Test tear down. 
Stop the test runner."); + dataPrepperTestRunner.stop(); + } + + @Test + void run_with_single_record() { + final String messageValue = UUID.randomUUID().toString(); + final Event event = JacksonEvent.fromMessage(messageValue); + final Record<Event> eventRecord = new Record<>(event); + + LOG.info("Submitting a single record."); + inMemorySourceAccessor.submit(IN_MEMORY_IDENTIFIER, Collections.singletonList(eventRecord)); + + await().atMost(400, TimeUnit.MILLISECONDS) + .untilAsserted(() -> { + assertThat(inMemorySinkAccessor.get(IN_MEMORY_IDENTIFIER), not(empty())); + }); + + final List<Record<Event>> records = inMemorySinkAccessor.get(IN_MEMORY_IDENTIFIER); + + assertThat(records.size(), equalTo(1)); + + assertThat(records.get(0), notNullValue()); + assertThat(records.get(0).getData(), notNullValue()); + assertThat(records.get(0).getData().get("message", String.class), equalTo(messageValue)); + assertThat(records.get(0).getData().get("test1", String.class), equalTo("knownPrefix10")); + assertThat(records.get(0).getData().get("test1_copy", String.class), equalTo("knownPrefix10")); + } + + @Test + void pipeline_with_single_batch_of_records() { + final int recordsToCreate = 200; + final List<Record<Event>> inputRecords = IntStream.range(0, recordsToCreate) + .mapToObj(i -> UUID.randomUUID().toString()) + .map(JacksonEvent::fromMessage) + .map(Record::new) + .collect(Collectors.toList()); + + LOG.info("Submitting a batch of records."); + inMemorySourceAccessor.submit(IN_MEMORY_IDENTIFIER, inputRecords); + + await().atMost(400, TimeUnit.MILLISECONDS) + .untilAsserted(() -> { + assertThat(inMemorySinkAccessor.get(IN_MEMORY_IDENTIFIER), not(empty())); + }); + + assertThat(inMemorySinkAccessor.get(IN_MEMORY_IDENTIFIER).size(), equalTo(recordsToCreate)); + + final List<Record<Event>> sinkRecords = inMemorySinkAccessor.get(IN_MEMORY_IDENTIFIER); + + for (int i = 0; i < sinkRecords.size(); i++) { + final Record<Event> inputRecord = inputRecords.get(i); + final Record<Event> sinkRecord = sinkRecords.get(i); + assertThat(sinkRecord, notNullValue()); + final Event recordData = sinkRecord.getData(); + assertThat(recordData, notNullValue()); + assertThat( + recordData.get("message", String.class), + equalTo(inputRecord.getData().get("message", String.class))); + assertThat(recordData.get("test1", String.class), + equalTo("knownPrefix1" + i)); + assertThat(recordData.get("test1_copy", String.class), + equalTo("knownPrefix1" + i)); + } + } +} diff --git a/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/integration/Router_ThreeRoutesDefaultIT.java b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/integration/Router_ThreeRoutesDefaultIT.java new file mode 100644 index 0000000000..fbc61053a5 --- /dev/null +++ b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/integration/Router_ThreeRoutesDefaultIT.java @@ -0,0 +1,130 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.integration; + +import org.junit.jupiter.api.AfterEach; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.JacksonEvent; +import org.opensearch.dataprepper.model.record.Record; +import org.opensearch.dataprepper.plugins.InMemorySinkAccessor; +import org.opensearch.dataprepper.plugins.InMemorySourceAccessor; +import org.opensearch.dataprepper.test.framework.DataPrepperTestRunner; + +import java.util.ArrayList; +import java.util.Collections;
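// The two integration tests above, and the routing test that follows, share one
// submit-then-await pattern against the in_memory plugins. A hedged helper-style sketch of
// that pattern (the accessor types come from this patch; the helper itself is illustrative,
// not part of the change):
private void submitAndAwaitOutput(final String testingKey, final List<Record<Event>> records) {
    inMemorySourceAccessor.submit(testingKey, records);        // feed the in_memory source
    await().atMost(2, TimeUnit.SECONDS)                        // poll until the sink has data
            .untilAsserted(() -> assertThat(
                    inMemorySinkAccessor.get(testingKey), not(empty())));
}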
+import java.util.List; +import java.util.Map; +import java.util.UUID; +import java.util.concurrent.TimeUnit; +import java.util.stream.Collectors; +import java.util.stream.IntStream; + +import static org.awaitility.Awaitility.await; +import static org.hamcrest.CoreMatchers.equalTo; +import static org.hamcrest.CoreMatchers.not; +import static org.hamcrest.MatcherAssert.assertThat; +import static org.hamcrest.Matchers.containsInAnyOrder; +import static org.hamcrest.Matchers.empty; + +class Router_ThreeRoutesDefaultIT { + private static final String TESTING_KEY = "ConditionalRoutingIT"; + private static final String ALL_SOURCE_KEY = TESTING_KEY + "_all"; + private static final String ALPHA_SOURCE_KEY = TESTING_KEY + "_alpha"; + private static final String BETA_SOURCE_KEY = TESTING_KEY + "_beta"; + private static final String ALPHA_DEFAULT_SOURCE_KEY = TESTING_KEY + "_alpha_default"; + private static final String ALPHA_BETA_GAMMA_SOURCE_KEY = TESTING_KEY + "_alpha_beta_gamma"; + private static final String DEFAULT_SOURCE_KEY = TESTING_KEY + "_default"; + private static final String KNOWN_CONDITIONAL_KEY = "value"; + private static final String ALPHA_VALUE = "a"; + private static final String BETA_VALUE = "b"; + private static final String GAMMA_VALUE = "g"; + private static final String DEFAULT_VALUE = "z"; + private DataPrepperTestRunner dataPrepperTestRunner; + private InMemorySourceAccessor inMemorySourceAccessor; + private InMemorySinkAccessor inMemorySinkAccessor; + + @BeforeEach + void setUp() { + dataPrepperTestRunner = DataPrepperTestRunner.builder() + .withPipelinesDirectoryOrFile("route/three-route-with-default-route.yaml") + .build(); + + dataPrepperTestRunner.start(); + inMemorySourceAccessor = dataPrepperTestRunner.getInMemorySourceAccessor(); + inMemorySinkAccessor = dataPrepperTestRunner.getInMemorySinkAccessor(); + } + + @AfterEach + void tearDown() { + dataPrepperTestRunner.stop(); + } + + @Test + void test_default_route() { + final List> alphaEvents = createEvents(ALPHA_VALUE, 10); + final List> betaEvents = createEvents(BETA_VALUE, 20); + final List> gammaEvents = createEvents(GAMMA_VALUE, 20); + final List> defaultEvents = createEvents(DEFAULT_VALUE, 20); + + final List> allEvents = new ArrayList<>(alphaEvents); + allEvents.addAll(betaEvents); + allEvents.addAll(gammaEvents); + allEvents.addAll(defaultEvents); + Collections.shuffle(allEvents); + + inMemorySourceAccessor.submit(TESTING_KEY, allEvents); + + await().atMost(2, TimeUnit.SECONDS) + .untilAsserted(() -> { + assertThat(inMemorySinkAccessor.get(ALPHA_SOURCE_KEY), not(empty())); + assertThat(inMemorySinkAccessor.get(BETA_SOURCE_KEY), not(empty())); + assertThat(inMemorySinkAccessor.get(ALL_SOURCE_KEY), not(empty())); + assertThat(inMemorySinkAccessor.get(ALPHA_DEFAULT_SOURCE_KEY), not(empty())); + assertThat(inMemorySinkAccessor.get(ALPHA_BETA_GAMMA_SOURCE_KEY), not(empty())); + assertThat(inMemorySinkAccessor.get(DEFAULT_SOURCE_KEY), not(empty())); + }); + + final List> actualAlphaRecords = inMemorySinkAccessor.get(ALPHA_SOURCE_KEY); + + assertThat(actualAlphaRecords.size(), equalTo(alphaEvents.size())); + + assertThat(actualAlphaRecords, containsInAnyOrder(allEvents.stream() + .filter(alphaEvents::contains).toArray())); + + final List> actualBetaRecords = inMemorySinkAccessor.get(BETA_SOURCE_KEY); + + assertThat(actualBetaRecords.size(), equalTo(betaEvents.size())); + + assertThat(actualBetaRecords, containsInAnyOrder(allEvents.stream() + .filter(betaEvents::contains).toArray())); + + final List> 
actualDefaultRecords = inMemorySinkAccessor.get(DEFAULT_SOURCE_KEY); + + assertThat(actualDefaultRecords.size(), equalTo(defaultEvents.size())); + assertThat(actualDefaultRecords, containsInAnyOrder(allEvents.stream() + .filter(defaultEvents::contains).toArray())); + + final List> actualAlphaDefaultRecords = new ArrayList<>(); + actualAlphaDefaultRecords.addAll(actualAlphaRecords); + actualAlphaDefaultRecords.addAll(actualDefaultRecords); + assertThat(actualAlphaDefaultRecords.size(), equalTo(defaultEvents.size()+alphaEvents.size())); + assertThat(actualAlphaDefaultRecords, containsInAnyOrder(allEvents.stream() + .filter(event -> defaultEvents.contains(event) || alphaEvents.contains(event)).toArray())); + + } + + private List> createEvents(final String value, final int numberToCreate) { + return IntStream.range(0, numberToCreate) + .mapToObj(i -> Map.of(KNOWN_CONDITIONAL_KEY, value, "arbitrary_field", UUID.randomUUID().toString())) + .map(map -> JacksonEvent.builder().withData(map).withEventType("TEST").build()) + .map(jacksonEvent -> (Event) jacksonEvent) + .map(Record::new) + .collect(Collectors.toList()); + } +} + diff --git a/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/InMemorySink.java b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/InMemorySink.java index dec7aa5c1f..360367a1e4 100644 --- a/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/InMemorySink.java +++ b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/InMemorySink.java @@ -40,7 +40,7 @@ public void output(final Collection> records) { records.stream().forEach((record) -> { EventHandle eventHandle = ((Event)record.getData()).getEventHandle(); if (acknowledgements) { - acknowledgementSetManager.releaseEventReference(eventHandle, result); + eventHandle.release(result); } }); } diff --git a/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/InMemorySourceAccessor.java b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/InMemorySourceAccessor.java index 71151be22e..3957d259a9 100644 --- a/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/InMemorySourceAccessor.java +++ b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/InMemorySourceAccessor.java @@ -6,20 +6,19 @@ package org.opensearch.dataprepper.plugins; import org.opensearch.dataprepper.model.event.Event; -import org.opensearch.dataprepper.model.event.JacksonEvent; -import org.opensearch.dataprepper.model.record.Record; -import org.opensearch.dataprepper.model.event.EventFactory; import org.opensearch.dataprepper.model.event.EventBuilder; +import org.opensearch.dataprepper.model.event.EventFactory; +import org.opensearch.dataprepper.model.record.Record; import java.util.ArrayList; import java.util.Collections; import java.util.HashMap; import java.util.List; -import java.util.UUID; import java.util.Map; +import java.util.UUID; +import java.util.concurrent.atomic.AtomicBoolean; import java.util.concurrent.locks.Lock; import java.util.concurrent.locks.ReentrantLock; -import java.util.concurrent.atomic.AtomicBoolean; /** * Provides a mechanism to write records to an in_memory source. 
This allows the pipeline to execute @@ -62,8 +61,8 @@ public void submit(final String testingKey, int numRecords) { for (int i = 0; i < numRecords; i++) { Map eventMap = Map.of("message", UUID.randomUUID().toString()); EventBuilder eventBuilder = (EventBuilder) eventFactory.eventBuilder(EventBuilder.class).withData(eventMap); - JacksonEvent event = (JacksonEvent) eventBuilder.build(); - records.add(new Record(event)); + Event event = eventBuilder.build(); + records.add(new Record<>(event)); } submit(testingKey, records); } @@ -79,8 +78,8 @@ public void submitWithStatus(final String testingKey, int numRecords) { int status = (int)(Math.random() * (max - min + 1) + min); Map eventMap = Map.of("message", UUID.randomUUID().toString(), "status", status); EventBuilder eventBuilder = (EventBuilder) eventFactory.eventBuilder(EventBuilder.class).withData(eventMap); - JacksonEvent event = (JacksonEvent) eventBuilder.build(); - records.add(new Record(event)); + Event event = eventBuilder.build(); + records.add(new Record<>(event)); } submit(testingKey, records); } diff --git a/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/SimpleCopyProcessor.java b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/SimpleCopyProcessor.java new file mode 100644 index 0000000000..a786f09128 --- /dev/null +++ b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/SimpleCopyProcessor.java @@ -0,0 +1,51 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins; + +import org.opensearch.dataprepper.model.annotations.DataPrepperPlugin; +import org.opensearch.dataprepper.model.annotations.DataPrepperPluginConstructor; +import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.processor.Processor; +import org.opensearch.dataprepper.model.record.Record; + +import java.util.Collection; + +@DataPrepperPlugin(name = "simple_copy_test", pluginType = Processor.class, pluginConfigurationType = SimpleCopyProcessorConfig.class) +public class SimpleCopyProcessor implements Processor, Record> { + private final SimpleCopyProcessorConfig simpleCopyProcessorConfig; + int count = 0; + + @DataPrepperPluginConstructor + public SimpleCopyProcessor(final SimpleCopyProcessorConfig simpleCopyProcessorConfig) { + this.simpleCopyProcessorConfig = simpleCopyProcessorConfig; + } + + @Override + public Collection> execute(final Collection> records) { + for (final Record record : records) { + final Object value = record.getData().get(simpleCopyProcessorConfig.getSource(), Object.class); + record.getData().put(simpleCopyProcessorConfig.getTarget(), value); + count++; + } + + return records; + } + + @Override + public void prepareForShutdown() { + + } + + @Override + public boolean isReadyForShutdown() { + return false; + } + + @Override + public void shutdown() { + + } +} diff --git a/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/SimpleCopyProcessorConfig.java b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/SimpleCopyProcessorConfig.java new file mode 100644 index 0000000000..ded7f6212f --- /dev/null +++ b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/SimpleCopyProcessorConfig.java @@ -0,0 +1,24 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins; + +import 
org.opensearch.dataprepper.model.event.EventKey; +import org.opensearch.dataprepper.model.event.EventKeyConfiguration; +import org.opensearch.dataprepper.model.event.EventKeyFactory; + +public class SimpleCopyProcessorConfig { + @EventKeyConfiguration(EventKeyFactory.EventAction.GET) + private EventKey source; + private EventKey target; + + public EventKey getSource() { + return source; + } + + public EventKey getTarget() { + return target; + } +} diff --git a/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/SimpleProcessor.java b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/SimpleProcessor.java new file mode 100644 index 0000000000..b0450d06d1 --- /dev/null +++ b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/SimpleProcessor.java @@ -0,0 +1,53 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins; + +import org.opensearch.dataprepper.model.annotations.DataPrepperPlugin; +import org.opensearch.dataprepper.model.annotations.DataPrepperPluginConstructor; +import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventKey; +import org.opensearch.dataprepper.model.processor.Processor; +import org.opensearch.dataprepper.model.record.Record; + +import java.util.Collection; + +@DataPrepperPlugin(name = "simple_test", pluginType = Processor.class, pluginConfigurationType = SimpleProcessorConfig.class) +public class SimpleProcessor implements Processor, Record> { + private final EventKey eventKey1; + private final String valuePrefix1; + int count = 0; + + @DataPrepperPluginConstructor + public SimpleProcessor(final SimpleProcessorConfig simpleProcessorConfig) { + eventKey1 = simpleProcessorConfig.getKey1(); + valuePrefix1 = simpleProcessorConfig.getValuePrefix1(); + } + + @Override + public Collection> execute(final Collection> records) { + for (final Record record : records) { + record.getData().put(eventKey1, valuePrefix1 + count); + count++; + } + + return records; + } + + @Override + public void prepareForShutdown() { + + } + + @Override + public boolean isReadyForShutdown() { + return false; + } + + @Override + public void shutdown() { + + } +} diff --git a/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/SimpleProcessorConfig.java b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/SimpleProcessorConfig.java new file mode 100644 index 0000000000..932d91c936 --- /dev/null +++ b/data-prepper-core/src/integrationTest/java/org/opensearch/dataprepper/plugins/SimpleProcessorConfig.java @@ -0,0 +1,24 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins; + +import org.opensearch.dataprepper.model.event.EventKey; +import org.opensearch.dataprepper.model.event.EventKeyConfiguration; +import org.opensearch.dataprepper.model.event.EventKeyFactory; + +public class SimpleProcessorConfig { + @EventKeyConfiguration(EventKeyFactory.EventAction.PUT) + private EventKey key1; + private String valuePrefix1; + + public EventKey getKey1() { + return key1; + } + + public String getValuePrefix1() { + return valuePrefix1; + } +} diff --git a/data-prepper-core/src/integrationTest/resources/org/opensearch/dataprepper/pipeline/processor-pipeline.yaml b/data-prepper-core/src/integrationTest/resources/org/opensearch/dataprepper/pipeline/processor-pipeline.yaml new file 
mode 100644 index 0000000000..be0e18a283 --- /dev/null +++ b/data-prepper-core/src/integrationTest/resources/org/opensearch/dataprepper/pipeline/processor-pipeline.yaml @@ -0,0 +1,17 @@ +processor-pipeline: + delay: 10 + source: + in_memory: + testing_key: ProcessorPipelineIT + + processor: + - simple_test: + key1: /test1 + value_prefix1: knownPrefix1 + - simple_copy_test: + source: /test1 + target: /test1_copy + + sink: + - in_memory: + testing_key: ProcessorPipelineIT diff --git a/data-prepper-core/src/integrationTest/resources/org/opensearch/dataprepper/pipeline/route/three-route-with-default-route.yaml b/data-prepper-core/src/integrationTest/resources/org/opensearch/dataprepper/pipeline/route/three-route-with-default-route.yaml new file mode 100644 index 0000000000..6d608a0d0b --- /dev/null +++ b/data-prepper-core/src/integrationTest/resources/org/opensearch/dataprepper/pipeline/route/three-route-with-default-route.yaml @@ -0,0 +1,41 @@ +routing-pipeline: + workers: 4 + delay: 10 + source: + in_memory: + testing_key: ConditionalRoutingIT + buffer: + bounded_blocking: + # Use a small batch size to help ensure that multiple threads + # are picking up the different routes. + batch_size: 10 + route: + - alpha: '/value == "a"' + - beta: '/value == "b"' + - gamma: '/value == "g"' + sink: + - in_memory: + testing_key: ConditionalRoutingIT_alpha + routes: + - alpha + - in_memory: + testing_key: ConditionalRoutingIT_beta + routes: + - beta + - in_memory: + testing_key: ConditionalRoutingIT_alpha_default + routes: + - alpha + - _default + - in_memory: + testing_key: ConditionalRoutingIT_alpha_beta_gamma + routes: + - alpha + - beta + - gamma + - in_memory: + testing_key: ConditionalRoutingIT_default + routes: + - _default + - in_memory: + testing_key: ConditionalRoutingIT_all diff --git a/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/AcknowledgementSetMonitor.java b/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/AcknowledgementSetMonitor.java index af9860cc9a..8c911346db 100644 --- a/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/AcknowledgementSetMonitor.java +++ b/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/AcknowledgementSetMonitor.java @@ -5,9 +5,6 @@ package org.opensearch.dataprepper.acknowledgements; -import org.opensearch.dataprepper.model.event.EventHandle; -import org.opensearch.dataprepper.model.event.DefaultEventHandle; -import org.opensearch.dataprepper.model.event.InternalEventHandle; import org.opensearch.dataprepper.model.acknowledgements.AcknowledgementSet; import java.util.concurrent.locks.ReentrantLock; @@ -33,15 +30,6 @@ class AcknowledgementSetMonitor implements Runnable { private final AtomicInteger numInvalidReleases; private final AtomicInteger numNullHandles; - private DefaultAcknowledgementSet getAcknowledgementSet(final EventHandle eventHandle) { - if (eventHandle instanceof DefaultEventHandle) { - InternalEventHandle internalEventHandle = (InternalEventHandle)(DefaultEventHandle)eventHandle; - return (DefaultAcknowledgementSet)internalEventHandle.getAcknowledgementSet(); - } else { - throw new RuntimeException("Unsupported event handle"); - } - } - public AcknowledgementSetMonitor() { this.acknowledgementSets = new HashSet<>(); this.lock = new ReentrantLock(true); @@ -67,55 +55,6 @@ public void add(final AcknowledgementSet acknowledgementSet) { } } - public void acquire(final EventHandle eventHandle) { - if (eventHandle == null) { - 
numNullHandles.incrementAndGet(); - return; - } - - DefaultAcknowledgementSet acknowledgementSet = getAcknowledgementSet(eventHandle); - lock.lock(); - boolean exists = false; - try { - exists = acknowledgementSets.contains(acknowledgementSet); - } finally { - lock.unlock(); - } - // if acknowledgementSet doesn't exist then it means that the - // event still active even after the acknowledgement set is - // cleaned up. - if (exists) { - acknowledgementSet.acquire(eventHandle); - } else { - LOG.warn("Trying acquire an event in an AcknowledgementSet that does not exist"); - numInvalidAcquires.incrementAndGet(); - } - } - - public void release(final EventHandle eventHandle, final boolean success) { - if (eventHandle == null) { - numNullHandles.incrementAndGet(); - return; - } - DefaultAcknowledgementSet acknowledgementSet = getAcknowledgementSet(eventHandle); - lock.lock(); - boolean exists = false; - try { - exists = acknowledgementSets.contains(acknowledgementSet); - } finally { - lock.unlock(); - } - // if acknowledgementSet doesn't exist then it means some late - // arrival of event handle release after the acknowledgement set - // is cleaned up. - if (exists) { - boolean b = acknowledgementSet.release(eventHandle, success); - } else { - LOG.warn("Trying to release from an AcknowledgementSet that does not exist"); - numInvalidReleases.incrementAndGet(); - } - } - /** * for testing * @return the size @@ -131,6 +70,8 @@ public void run() { if (acknowledgementSets.size() > 0) { acknowledgementSets.removeIf((ackSet) -> ((DefaultAcknowledgementSet) ackSet).isDone()); } + Thread.sleep(1000); + } catch (InterruptedException e) { } finally { lock.unlock(); } diff --git a/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSet.java b/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSet.java index c2823203fe..fd26d10c72 100644 --- a/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSet.java +++ b/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSet.java @@ -82,7 +82,7 @@ public void add(Event event) { EventHandle eventHandle = event.getEventHandle(); if (eventHandle instanceof DefaultEventHandle) { InternalEventHandle internalEventHandle = (InternalEventHandle)(DefaultEventHandle)eventHandle; - internalEventHandle.setAcknowledgementSet(this); + internalEventHandle.addAcknowledgementSet(this); pendingAcknowledgments.put(eventHandle, new AtomicInteger(1)); totalEventsAdded.incrementAndGet(); } diff --git a/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSetManager.java b/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSetManager.java index 3f2e3761bd..b8f81dbfc1 100644 --- a/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSetManager.java +++ b/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSetManager.java @@ -7,8 +7,6 @@ import org.opensearch.dataprepper.model.acknowledgements.AcknowledgementSet; import org.opensearch.dataprepper.model.acknowledgements.AcknowledgementSetManager; -import org.opensearch.dataprepper.model.event.Event; -import org.opensearch.dataprepper.model.event.EventHandle; import org.opensearch.dataprepper.metrics.PluginMetrics; import javax.inject.Inject; @@ -49,18 +47,6 @@ public 
AcknowledgementSet create(final Consumer<Boolean> callback, final Duratio return acknowledgementSet; } - public void acquireEventReference(final Event event) { - acquireEventReference(event.getEventHandle()); - } - - public void acquireEventReference(final EventHandle eventHandle) { - acknowledgementSetMonitor.acquire(eventHandle); - } - - public void releaseEventReference(final EventHandle eventHandle, final boolean success) { - acknowledgementSetMonitor.release(eventHandle, success); - } - public void shutdown() { acknowledgementSetMonitorThread.stop(); } diff --git a/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/InactiveAcknowledgementSetManager.java b/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/InactiveAcknowledgementSetManager.java index 2e112b4560..52f0e1978f 100644 --- a/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/InactiveAcknowledgementSetManager.java +++ b/data-prepper-core/src/main/java/org/opensearch/dataprepper/acknowledgements/InactiveAcknowledgementSetManager.java @@ -5,8 +5,6 @@ package org.opensearch.dataprepper.acknowledgements; -import org.opensearch.dataprepper.model.event.Event; -import org.opensearch.dataprepper.model.event.EventHandle; import org.opensearch.dataprepper.model.acknowledgements.AcknowledgementSetManager; import org.opensearch.dataprepper.model.acknowledgements.AcknowledgementSet; import java.util.function.Consumer; @@ -26,15 +24,4 @@ public AcknowledgementSet create(final Consumer<Boolean> callback, final Duratio throw new UnsupportedOperationException("create operation not supported"); } - public void acquireEventReference(final Event event) { - throw new UnsupportedOperationException("acquire operation not supported"); - } - - public void acquireEventReference(final EventHandle eventHandle) { - throw new UnsupportedOperationException("acquire operation not supported"); - } - - public void releaseEventReference(final EventHandle eventHandle, boolean success) { - throw new UnsupportedOperationException("release operation not supported"); - } } diff --git a/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/DefaultPeerForwarderProvider.java b/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/DefaultPeerForwarderProvider.java new file mode 100644 index 0000000000..ff638ee26f --- /dev/null +++ b/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/DefaultPeerForwarderProvider.java @@ -0,0 +1,102 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.peerforwarder; + +import org.opensearch.dataprepper.metrics.PluginMetrics; +import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.record.Record; +import org.opensearch.dataprepper.model.processor.Processor; +import org.opensearch.dataprepper.peerforwarder.client.PeerForwarderClient; +import org.opensearch.dataprepper.peerforwarder.discovery.DiscoveryMode; + +import java.util.HashMap; +import java.util.Map; +import java.util.Set; + +public class DefaultPeerForwarderProvider implements PeerForwarderProvider { + + private final PeerForwarderClientFactory peerForwarderClientFactory; + private final PeerForwarderClient peerForwarderClient; + private final PeerForwarderConfiguration peerForwarderConfiguration; + private final PluginMetrics pluginMetrics; + private final Map<String, Map<String, PeerForwarderReceiveBuffer<Record<Event>>>> pipelinePeerForwarderReceiveBufferMap = new HashMap<>(); + private HashRing hashRing; + +
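// The map above nests pipeline name -> plugin id -> receive buffer. A hedged illustration
// of a lookup against that structure (an assumed helper added for clarity, not part of
// this patch):
private PeerForwarderReceiveBuffer<Record<Event>> findBuffer(final String pipelineName, final String pluginId) {
    return pipelinePeerForwarderReceiveBufferMap
            .getOrDefault(pipelineName, Map.of())   // unknown pipeline -> empty map
            .get(pluginId);                         // unknown plugin id -> null
}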
DefaultPeerForwarderProvider(final PeerForwarderClientFactory peerForwarderClientFactory, + final PeerForwarderClient peerForwarderClient, + final PeerForwarderConfiguration peerForwarderConfiguration, + final PluginMetrics pluginMetrics) { + this.peerForwarderClientFactory = peerForwarderClientFactory; + this.peerForwarderClient = peerForwarderClient; + this.peerForwarderConfiguration = peerForwarderConfiguration; + this.pluginMetrics = pluginMetrics; + } + + public PeerForwarder register(final String pipelineName, final Processor processor, final String pluginId, final Set<String> identificationKeys, + final Integer pipelineWorkerThreads) { + if (pipelinePeerForwarderReceiveBufferMap.containsKey(pipelineName) && + pipelinePeerForwarderReceiveBufferMap.get(pipelineName).containsKey(pluginId)) { + throw new RuntimeException("Data Prepper 2.0 will only support a single peer-forwarder per pipeline/plugin type"); + } + + final PeerForwarderReceiveBuffer<Record<Event>> peerForwarderReceiveBuffer = createBufferPerPipelineProcessor(pipelineName, pluginId); + + if (isPeerForwardingRequired()) { + if (hashRing == null) { + hashRing = peerForwarderClientFactory.createHashRing(); + } + return new RemotePeerForwarder( + peerForwarderClient, + hashRing, + peerForwarderReceiveBuffer, + pipelineName, + pluginId, + identificationKeys, + pluginMetrics, + peerForwarderConfiguration.getBatchDelay(), + peerForwarderConfiguration.getFailedForwardingRequestLocalWriteTimeout(), + peerForwarderConfiguration.getForwardingBatchSize(), + peerForwarderConfiguration.getForwardingBatchQueueDepth(), + peerForwarderConfiguration.getForwardingBatchTimeout(), + pipelineWorkerThreads + ); + } + else { + return new LocalPeerForwarder(); + } + } + + private PeerForwarderReceiveBuffer<Record<Event>> createBufferPerPipelineProcessor(final String pipelineName, final String pluginId) { + final PeerForwarderReceiveBuffer<Record<Event>> peerForwarderReceiveBuffer = new + PeerForwarderReceiveBuffer<>(peerForwarderConfiguration.getBufferSize(), peerForwarderConfiguration.getBatchSize(), pipelineName, pluginId); + + final Map<String, PeerForwarderReceiveBuffer<Record<Event>>> pluginsBufferMap = + pipelinePeerForwarderReceiveBufferMap.computeIfAbsent(pipelineName, k -> new HashMap<>()); + + pluginsBufferMap.put(pluginId, peerForwarderReceiveBuffer); + + return peerForwarderReceiveBuffer; + } + + public boolean isPeerForwardingRequired() { + return arePeersConfigured() && pipelinePeerForwarderReceiveBufferMap.size() > 0; + } + + public boolean arePeersConfigured() { + final DiscoveryMode discoveryMode = peerForwarderConfiguration.getDiscoveryMode(); + if (discoveryMode.equals(DiscoveryMode.LOCAL_NODE)) { + return false; + } + else if (discoveryMode.equals(DiscoveryMode.STATIC) && peerForwarderConfiguration.getStaticEndpoints().size() <= 1) { + return false; + } + return true; + } + + public Map<String, Map<String, PeerForwarderReceiveBuffer<Record<Event>>>> getPipelinePeerForwarderReceiveBufferMap() { + return pipelinePeerForwarderReceiveBufferMap; + } +} diff --git a/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/LocalModePeerForwarderProvider.java b/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/LocalModePeerForwarderProvider.java new file mode 100644 index 0000000000..6c2c4fe688 --- /dev/null +++ b/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/LocalModePeerForwarderProvider.java @@ -0,0 +1,51 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.peerforwarder; + +import org.opensearch.dataprepper.model.event.Event; +import 
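register(...) above picks between a remote and a local forwarder and lazily builds a single HashRing that every remote registration then shares. A compilable reduction of that decision, with all Data Prepper types replaced by stand-ins:

```java
import java.util.Set;

// Minimal model of the register(...) decision above: local forwarder when no
// peers are configured, otherwise a remote forwarder sharing one lazily
// created hash ring. All names here are illustrative.
class ProviderSketch {
    interface Forwarder {}
    static class LocalForwarder implements Forwarder {}
    static class RemoteForwarder implements Forwarder {
        final Object hashRing;
        RemoteForwarder(Object hashRing) { this.hashRing = hashRing; }
    }

    private final boolean peersConfigured;
    private Object hashRing; // created at most once, on first remote registration

    ProviderSketch(boolean peersConfigured) { this.peersConfigured = peersConfigured; }

    Forwarder register(String pipelineName, String pluginId, Set<String> identificationKeys) {
        if (!peersConfigured) {
            return new LocalForwarder();
        }
        if (hashRing == null) {
            hashRing = new Object(); // stands in for peerForwarderClientFactory.createHashRing()
        }
        return new RemoteForwarder(hashRing);
    }
}
```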
org.opensearch.dataprepper.model.record.Record; +import org.opensearch.dataprepper.model.processor.Processor; +import org.opensearch.dataprepper.model.peerforwarder.RequiresPeerForwarding; + +import java.util.Map; +import java.util.Set; + +public class LocalModePeerForwarderProvider implements PeerForwarderProvider { + + private final PeerForwarderProvider peerForwarderProvider; + private boolean isRemotePeerForwarderRegistered; + + public LocalModePeerForwarderProvider(final PeerForwarderProvider peerForwarderProvider) { + this.peerForwarderProvider = peerForwarderProvider; + this.isRemotePeerForwarderRegistered = false; + } + + @Override + public PeerForwarder register(final String pipelineName, final Processor processor, final String pluginId, final Set identificationKeys, final Integer pipelineWorkerThreads) { + if (((RequiresPeerForwarding)processor).isForLocalProcessingOnly(null)) { + return new LocalPeerForwarder(); + } + isRemotePeerForwarderRegistered = true; + return peerForwarderProvider.register(pipelineName, processor, pluginId, identificationKeys, pipelineWorkerThreads); + } + + @Override + public boolean isPeerForwardingRequired() { + return isRemotePeerForwarderRegistered; + } + + @Override + public Map>>> getPipelinePeerForwarderReceiveBufferMap() { + return (isRemotePeerForwarderRegistered) ? + peerForwarderProvider.getPipelinePeerForwarderReceiveBufferMap() : + Map.of(); + } + + @Override + public boolean arePeersConfigured() { + return isRemotePeerForwarderRegistered ? peerForwarderProvider.arePeersConfigured() : false; + } +} diff --git a/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/PeerForwarderAppConfig.java b/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/PeerForwarderAppConfig.java index e3123b67f1..4cca81819d 100644 --- a/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/PeerForwarderAppConfig.java +++ b/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/PeerForwarderAppConfig.java @@ -20,6 +20,10 @@ import org.springframework.beans.factory.annotation.Qualifier; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; +import org.springframework.context.annotation.Primary; + +import javax.inject.Named; + @Configuration class PeerForwarderAppConfig { @@ -71,12 +75,18 @@ public PeerForwarderClient peerForwarderClient(final PeerForwarderConfiguration peerForwarderConfiguration, peerForwarderClientFactory, peerForwarderCodec, pluginMetrics); } - @Bean - public PeerForwarderProvider peerForwarderProvider(final PeerForwarderClientFactory peerForwarderClientFactory, + @Bean(name = "defaultPeerForwarder") + public DefaultPeerForwarderProvider peerForwarderProvider(final PeerForwarderClientFactory peerForwarderClientFactory, final PeerForwarderClient peerForwarderClient, final PeerForwarderConfiguration peerForwarderConfiguration, @Qualifier("peerForwarderMetrics") final PluginMetrics pluginMetrics) { - return new PeerForwarderProvider(peerForwarderClientFactory, peerForwarderClient, peerForwarderConfiguration, pluginMetrics); + return new DefaultPeerForwarderProvider(peerForwarderClientFactory, peerForwarderClient, peerForwarderConfiguration, pluginMetrics); + } + + @Bean + @Primary + public PeerForwarderProvider peerForwarderProvider(@Named("defaultPeerForwarder") final PeerForwarderProvider peerForwarderProvider) { + return new LocalModePeerForwarderProvider(peerForwarderProvider); } @Bean diff --git 
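The PeerForwarderAppConfig hunk above is a classic Spring decorator wiring: the concrete DefaultPeerForwarderProvider is published under the qualifier defaultPeerForwarder, and the @Primary bean wraps it in LocalModePeerForwarderProvider so every plain PeerForwarderProvider injection point receives the wrapper. The same pattern with generic names (Service, DefaultService, and DecoratingService are illustrative):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

import javax.inject.Named;

// Sketch of the qualifier-plus-@Primary decorator wiring used above.
@Configuration
class DecoratorWiringSketch {
    interface Service {}
    static class DefaultService implements Service {}
    static class DecoratingService implements Service {
        private final Service inner;
        DecoratingService(Service inner) { this.inner = inner; }
    }

    // The concrete bean stays reachable by name for anything that needs it.
    @Bean(name = "defaultService")
    DefaultService defaultService() {
        return new DefaultService();
    }

    // @Primary wins whenever an unqualified Service is injected.
    @Bean
    @Primary
    Service service(@Named("defaultService") Service inner) {
        return new DecoratingService(inner);
    }
}
```

The wrapper can then short-circuit register(...) and hand back a LocalPeerForwarder whenever the processor reports it is for local processing only, without the rest of the application noticing.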
a/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/PeerForwarderProvider.java b/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/PeerForwarderProvider.java index ea89c1cbdd..40b3a03deb 100644 --- a/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/PeerForwarderProvider.java +++ b/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/PeerForwarderProvider.java @@ -5,97 +5,49 @@ package org.opensearch.dataprepper.peerforwarder; -import org.opensearch.dataprepper.metrics.PluginMetrics; import org.opensearch.dataprepper.model.event.Event; import org.opensearch.dataprepper.model.record.Record; -import org.opensearch.dataprepper.peerforwarder.client.PeerForwarderClient; -import org.opensearch.dataprepper.peerforwarder.discovery.DiscoveryMode; +import org.opensearch.dataprepper.model.processor.Processor; -import java.util.HashMap; import java.util.Map; import java.util.Set; -public class PeerForwarderProvider { - - private final PeerForwarderClientFactory peerForwarderClientFactory; - private final PeerForwarderClient peerForwarderClient; - private final PeerForwarderConfiguration peerForwarderConfiguration; - private final PluginMetrics pluginMetrics; - private final Map>>> pipelinePeerForwarderReceiveBufferMap = new HashMap<>(); - private HashRing hashRing; - - PeerForwarderProvider(final PeerForwarderClientFactory peerForwarderClientFactory, - final PeerForwarderClient peerForwarderClient, - final PeerForwarderConfiguration peerForwarderConfiguration, - final PluginMetrics pluginMetrics) { - this.peerForwarderClientFactory = peerForwarderClientFactory; - this.peerForwarderClient = peerForwarderClient; - this.peerForwarderConfiguration = peerForwarderConfiguration; - this.pluginMetrics = pluginMetrics; - } - - public PeerForwarder register(final String pipelineName, final String pluginId, final Set identificationKeys, - final Integer pipelineWorkerThreads) { - if (pipelinePeerForwarderReceiveBufferMap.containsKey(pipelineName) && - pipelinePeerForwarderReceiveBufferMap.get(pipelineName).containsKey(pluginId)) { - throw new RuntimeException("Data Prepper 2.0 will only support a single peer-forwarder per pipeline/plugin type"); - } - - final PeerForwarderReceiveBuffer> peerForwarderReceiveBuffer = createBufferPerPipelineProcessor(pipelineName, pluginId); - - if (isPeerForwardingRequired()) { - if (hashRing == null) { - hashRing = peerForwarderClientFactory.createHashRing(); - } - return new RemotePeerForwarder( - peerForwarderClient, - hashRing, - peerForwarderReceiveBuffer, - pipelineName, - pluginId, - identificationKeys, - pluginMetrics, - peerForwarderConfiguration.getBatchDelay(), - peerForwarderConfiguration.getFailedForwardingRequestLocalWriteTimeout(), - peerForwarderConfiguration.getForwardingBatchSize(), - peerForwarderConfiguration.getForwardingBatchQueueDepth(), - peerForwarderConfiguration.getForwardingBatchTimeout(), - pipelineWorkerThreads - ); - } - else { - return new LocalPeerForwarder(); - } - } - - private PeerForwarderReceiveBuffer> createBufferPerPipelineProcessor(final String pipelineName, final String pluginId) { - final PeerForwarderReceiveBuffer> peerForwarderReceiveBuffer = new - PeerForwarderReceiveBuffer<>(peerForwarderConfiguration.getBufferSize(), peerForwarderConfiguration.getBatchSize(), pipelineName, pluginId); - - final Map>> pluginsBufferMap = - pipelinePeerForwarderReceiveBufferMap.computeIfAbsent(pipelineName, k -> new HashMap<>()); - - pluginsBufferMap.put(pluginId, 
peerForwarderReceiveBuffer); - - return peerForwarderReceiveBuffer; - } - - public boolean isPeerForwardingRequired() { - return arePeersConfigured() && pipelinePeerForwarderReceiveBufferMap.size() > 0; - } - - private boolean arePeersConfigured() { - final DiscoveryMode discoveryMode = peerForwarderConfiguration.getDiscoveryMode(); - if (discoveryMode.equals(DiscoveryMode.LOCAL_NODE)) { - return false; - } - else if (discoveryMode.equals(DiscoveryMode.STATIC) && peerForwarderConfiguration.getStaticEndpoints().size() <= 1) { - return false; - } - return true; - } - - public Map>>> getPipelinePeerForwarderReceiveBufferMap() { - return pipelinePeerForwarderReceiveBufferMap; - } +public interface PeerForwarderProvider { + /** + * Registers a pipeline, processor, and identification keys + * + * @param pipelineName pipeline name + * @param processor processor + * @param pluginId plugin id + * @param identificationKeys identification keys + * @param pipelineWorkerThreads number of pipeline worker threads + * @return peer forwarder + * @since 2.9 + */ + PeerForwarder register(final String pipelineName, final Processor processor, final String pluginId, final Set<String> identificationKeys, final Integer pipelineWorkerThreads); + + /** + * Returns whether peer forwarding is required + * + * @return true if peer forwarding is required, false if not + * @since 2.9 + */ + boolean isPeerForwardingRequired(); + + /** + * Returns whether peers are configured + * + * @return true if peers are configured, false otherwise + * @since 2.9 + */ + boolean arePeersConfigured(); + + /** + * Returns pipeline peer forwarder receive buffer map + * + * @return Map of buffer per pipeline per pluginId + * @since 2.9 + */ + Map<String, Map<String, PeerForwarderReceiveBuffer<Record<Event>>>> getPipelinePeerForwarderReceiveBufferMap(); } + diff --git a/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/PeerForwardingProcessorDecorator.java b/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/PeerForwardingProcessorDecorator.java index 58a99aadae..038bdb28c5 100644 --- a/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/PeerForwardingProcessorDecorator.java +++ b/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/PeerForwardingProcessorDecorator.java @@ -67,7 +67,7 @@ public static List<Processor> decorateProcessors( "Peer Forwarder Plugin: %s cannot have empty identification keys." 
+ pluginId); } - final PeerForwarder peerForwarder = peerForwarderProvider.register(pipelineName, pluginId, identificationKeys, pipelineWorkerThreads); + final PeerForwarder peerForwarder = peerForwarderProvider.register(pipelineName, firstInnerProcessor, pluginId, identificationKeys, pipelineWorkerThreads); return processors.stream().map(processor -> new PeerForwardingProcessorDecorator(peerForwarder, processor)) .collect(Collectors.toList()); diff --git a/data-prepper-core/src/main/java/org/opensearch/dataprepper/pipeline/ProcessWorker.java b/data-prepper-core/src/main/java/org/opensearch/dataprepper/pipeline/ProcessWorker.java index 2178fd6bcc..b5538dfe73 100644 --- a/data-prepper-core/src/main/java/org/opensearch/dataprepper/pipeline/ProcessWorker.java +++ b/data-prepper-core/src/main/java/org/opensearch/dataprepper/pipeline/ProcessWorker.java @@ -100,7 +100,7 @@ private void processAcknowledgements(List inputEvents, Collection void route(final Collection allRecords, final DataFlowComponent dataFlowComponent, final Map> recordsToRoutes, @@ -37,7 +38,9 @@ void route(final Collection allRecords, final Set routesForEvent = recordsToRoutes .getOrDefault(record, Collections.emptySet()); - if (routesForEvent.stream().anyMatch(dataFlowComponentRoutes::contains)) { + if (routesForEvent.size() == 0 && dataFlowComponentRoutes.contains(DEFAULT_ROUTE)) { + recordsForComponent.add(getRecordStrategy.getRecord(record)); + } else if (routesForEvent.stream().anyMatch(dataFlowComponentRoutes::contains)) { recordsForComponent.add(getRecordStrategy.getRecord(record)); } } diff --git a/data-prepper-core/src/main/java/org/opensearch/dataprepper/pipeline/router/RouterCopyRecordStrategy.java b/data-prepper-core/src/main/java/org/opensearch/dataprepper/pipeline/router/RouterCopyRecordStrategy.java index 1bd2944c2e..b4982c5b07 100644 --- a/data-prepper-core/src/main/java/org/opensearch/dataprepper/pipeline/router/RouterCopyRecordStrategy.java +++ b/data-prepper-core/src/main/java/org/opensearch/dataprepper/pipeline/router/RouterCopyRecordStrategy.java @@ -16,6 +16,7 @@ import org.opensearch.dataprepper.model.event.JacksonEvent; import org.opensearch.dataprepper.model.event.EventFactory; import org.opensearch.dataprepper.model.event.EventHandle; +import org.opensearch.dataprepper.model.event.InternalEventHandle; import org.opensearch.dataprepper.model.event.EventBuilder; import org.opensearch.dataprepper.model.event.EventMetadata; import org.opensearch.dataprepper.model.event.DefaultEventHandle; @@ -65,8 +66,8 @@ private void acquireEventReference(final Record record) { } if (referencedRecords.contains(record) || ((routedRecords != null) && routedRecords.contains(record))) { EventHandle eventHandle = ((JacksonEvent)record.getData()).getEventHandle(); - if (eventHandle != null && eventHandle instanceof DefaultEventHandle) { - acknowledgementSetManager.acquireEventReference(eventHandle); + if (eventHandle != null && eventHandle instanceof InternalEventHandle) { + ((InternalEventHandle)eventHandle).acquireReference(); } } else if (!referencedRecords.contains(record)) { referencedRecords.add(record); diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/AcknowledgementSetMonitorTests.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/AcknowledgementSetMonitorTests.java index 6c85b5c4de..158841a44a 100644 --- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/AcknowledgementSetMonitorTests.java +++ 
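The Router hunk above adds a fallback branch: a record whose event matched no user-defined route is still delivered to any component subscribed to the default route. Below is a condensed model of that control flow; the DEFAULT_ROUTE value "_default" is an assumption, and only the branching mirrors the hunk:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Sketch of the routing fallback added above, with generic record type R
// standing in for Record<Event>.
class RouteFallbackSketch {
    static final String DEFAULT_ROUTE = "_default"; // assumed constant value

    static <R> List<R> route(List<R> allRecords,
                             Set<String> componentRoutes,
                             Map<R, Set<String>> recordsToRoutes) {
        List<R> recordsForComponent = new ArrayList<>();
        for (R record : allRecords) {
            Set<String> routesForEvent =
                    recordsToRoutes.getOrDefault(record, Collections.emptySet());
            if (routesForEvent.isEmpty() && componentRoutes.contains(DEFAULT_ROUTE)) {
                // New branch: unrouted records fall through to default-route components.
                recordsForComponent.add(record);
            } else if (routesForEvent.stream().anyMatch(componentRoutes::contains)) {
                recordsForComponent.add(record);
            }
        }
        return recordsForComponent;
    }
}
```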
b/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/AcknowledgementSetMonitorTests.java @@ -13,7 +13,6 @@ import org.junit.jupiter.api.extension.ExtendWith; import static org.mockito.Mockito.when; import static org.mockito.Mockito.mock; -import static org.mockito.Mockito.doAnswer; import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.MatcherAssert.assertThat; @@ -71,57 +70,4 @@ public void testMultipleAcknowledgementSets() { acknowledgementSetMonitor.run(); assertThat(acknowledgementSetMonitor.getSize(), equalTo(1)); } - - @Test - public void testAcknowledgementSetAcquireRelease() { - when(eventHandle1.getAcknowledgementSet()).thenReturn(acknowledgementSet1); - try { - doAnswer((i) -> {return null; }).when(acknowledgementSet1).acquire(eventHandle1); - } catch (Exception e){} - acknowledgementSetMonitor.add(acknowledgementSet1); - acknowledgementSetMonitor.acquire(eventHandle1); - acknowledgementSetMonitor.release(eventHandle1, true); - Thread shutdownThread = new Thread(() -> { - try { - Thread.sleep(DEFAULT_WAIT_TIME_MS); - } catch (Exception e){} - }); - shutdownThread.start(); - acknowledgementSetMonitor.run(); - assertThat(acknowledgementSetMonitor.getSize(), equalTo(0)); - } - - @Test - public void testAcknowledgementSetInvalidAcquire() { - acknowledgementSet2 = mock(DefaultAcknowledgementSet.class); - when(eventHandle1.getAcknowledgementSet()).thenReturn(acknowledgementSet2); - acknowledgementSetMonitor.add(acknowledgementSet1); - acknowledgementSetMonitor.acquire(eventHandle1); - Thread shutdownThread = new Thread(() -> { - try { - Thread.sleep(DEFAULT_WAIT_TIME_MS); - } catch (Exception e){} - }); - shutdownThread.start(); - acknowledgementSetMonitor.run(); - assertThat(acknowledgementSetMonitor.getSize(), equalTo(0)); - assertThat(acknowledgementSetMonitor.getNumInvalidAcquires(), equalTo(1)); - } - - @Test - public void testAcknowledgementSetInvalidRelease() { - acknowledgementSet2 = mock(DefaultAcknowledgementSet.class); - when(eventHandle1.getAcknowledgementSet()).thenReturn(acknowledgementSet2); - acknowledgementSetMonitor.add(acknowledgementSet1); - acknowledgementSetMonitor.release(eventHandle1, true); - Thread shutdownThread = new Thread(() -> { - try { - Thread.sleep(DEFAULT_WAIT_TIME_MS); - } catch (Exception e){} - }); - shutdownThread.start(); - acknowledgementSetMonitor.run(); - assertThat(acknowledgementSetMonitor.getSize(), equalTo(0)); - assertThat(acknowledgementSetMonitor.getNumInvalidReleases(), equalTo(1)); - } } diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSetManagerTests.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSetManagerTests.java index 1b87d6c849..a083f5ea85 100644 --- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSetManagerTests.java +++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSetManagerTests.java @@ -14,6 +14,8 @@ import org.junit.jupiter.api.Test; import org.mockito.junit.jupiter.MockitoExtension; import org.junit.jupiter.api.extension.ExtendWith; +import static org.mockito.Mockito.doAnswer; +import static org.mockito.ArgumentMatchers.any; import org.mockito.Mock; import static org.awaitility.Awaitility.await; @@ -53,17 +55,27 @@ class DefaultAcknowledgementSetManagerTests { void setup() { currentRatio = 0; callbackExecutor = Executors.newScheduledThreadPool(2); + 
acknowledgementSetManager = createObjectUnderTest(); + AcknowledgementSet acknowledgementSet1 = acknowledgementSetManager.create((flag) -> { result = flag; }, TEST_TIMEOUT); event1 = mock(JacksonEvent.class); eventHandle1 = mock(DefaultEventHandle.class); + lenient().doAnswer(a -> { + Boolean result = (Boolean)a.getArgument(0); + acknowledgementSet1.release(eventHandle1, result); + return null; + }).when(eventHandle1).release(any(Boolean.class)); lenient().when(event1.getEventHandle()).thenReturn(eventHandle1); pluginMetrics = mock(PluginMetrics.class); event2 = mock(JacksonEvent.class); eventHandle2 = mock(DefaultEventHandle.class); + lenient().doAnswer(a -> { + Boolean result = (Boolean)a.getArgument(0); + acknowledgementSet1.release(eventHandle2, result); + return null; + }).when(eventHandle2).release(any(Boolean.class)); lenient().when(event2.getEventHandle()).thenReturn(eventHandle2); - acknowledgementSetManager = createObjectUnderTest(); - AcknowledgementSet acknowledgementSet1 = acknowledgementSetManager.create((flag) -> { result = flag; }, TEST_TIMEOUT); acknowledgementSet1.add(event1); acknowledgementSet1.add(event2); lenient().when(eventHandle1.getAcknowledgementSet()).thenReturn(acknowledgementSet1); @@ -77,8 +89,8 @@ DefaultAcknowledgementSetManager createObjectUnderTest() { @Test void testBasic() { - acknowledgementSetManager.releaseEventReference(eventHandle2, true); - acknowledgementSetManager.releaseEventReference(eventHandle1, true); + eventHandle2.release(true); + eventHandle1.release(true); await().atMost(TEST_TIMEOUT.multipliedBy(5)) .untilAsserted(() -> { assertThat(acknowledgementSetManager.getAcknowledgementSetMonitor().getSize(), equalTo(0)); @@ -88,7 +100,7 @@ void testBasic() { @Test void testExpirations() throws InterruptedException { - acknowledgementSetManager.releaseEventReference(eventHandle2, true); + eventHandle2.release(true); Thread.sleep(TEST_TIMEOUT.multipliedBy(5).toMillis()); assertThat(acknowledgementSetManager.getAcknowledgementSetMonitor().getSize(), equalTo(0)); await().atMost(TEST_TIMEOUT.multipliedBy(5)) @@ -99,17 +111,22 @@ void testExpirations() throws InterruptedException { @Test void testMultipleAcknowledgementSets() { + AcknowledgementSet acknowledgementSet2 = acknowledgementSetManager.create((flag) -> { result = flag; }, TEST_TIMEOUT); event3 = mock(JacksonEvent.class); eventHandle3 = mock(DefaultEventHandle.class); + doAnswer(a -> { + Boolean result = (Boolean)a.getArgument(0); + acknowledgementSet2.release(eventHandle3, result); + return null; + }).when(eventHandle3).release(any(Boolean.class)); lenient().when(event3.getEventHandle()).thenReturn(eventHandle3); - AcknowledgementSet acknowledgementSet2 = acknowledgementSetManager.create((flag) -> { result = flag; }, TEST_TIMEOUT); acknowledgementSet2.add(event3); lenient().when(eventHandle3.getAcknowledgementSet()).thenReturn(acknowledgementSet2); acknowledgementSet2.complete(); - acknowledgementSetManager.releaseEventReference(eventHandle2, true); - acknowledgementSetManager.releaseEventReference(eventHandle3, true); + eventHandle2.release(true); + eventHandle3.release(true); await().atMost(TEST_TIMEOUT.multipliedBy(5)) .untilAsserted(() -> { assertThat(acknowledgementSetManager.getAcknowledgementSetMonitor().getSize(), equalTo(0)); @@ -119,22 +136,42 @@ void testMultipleAcknowledgementSets() { @Test void testWithProgressCheckCallbacks() { + AcknowledgementSet acknowledgementSet2 = acknowledgementSetManager.create((flag) -> { result = flag; }, Duration.ofMillis(10000)); eventHandle3 = 
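The setup changes above repeat one Mockito idiom: because the handles are mocks with no real release logic, each release(flag) call is stubbed with doAnswer to delegate to the acknowledgement set under test. A generic version of that stub, where Handle and AckSet are stand-ins for the real types:

```java
import static org.mockito.ArgumentMatchers.anyBoolean;
import static org.mockito.Mockito.doAnswer;
import static org.mockito.Mockito.mock;

// Sketch of the doAnswer delegation used throughout these tests.
class ReleaseStubSketch {
    interface Handle { void release(boolean success); }
    interface AckSet { boolean release(Handle handle, boolean success); }

    static Handle stubbedHandle(AckSet set) {
        Handle handle = mock(Handle.class);
        doAnswer(invocation -> {
            boolean success = invocation.getArgument(0);
            set.release(handle, success); // forward to the real set under test
            return null;
        }).when(handle).release(anyBoolean());
        return handle;
    }
}
```

This keeps the production-shaped call, handle.release(...), as the thing the tests invoke, while the real DefaultAcknowledgementSet still performs the reference counting being verified.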
mock(DefaultEventHandle.class); + doAnswer(a -> { + Boolean result = (Boolean)a.getArgument(0); + acknowledgementSet2.release(eventHandle3, result); + return null; + }).when(eventHandle3).release(any(Boolean.class)); lenient().when(event3.getEventHandle()).thenReturn(eventHandle3); eventHandle4 = mock(DefaultEventHandle.class); + doAnswer(a -> { + Boolean result = (Boolean)a.getArgument(0); + acknowledgementSet2.release(eventHandle4, result); + return null; + }).when(eventHandle4).release(any(Boolean.class)); JacksonEvent event4 = mock(JacksonEvent.class); lenient().when(event4.getEventHandle()).thenReturn(eventHandle4); eventHandle5 = mock(DefaultEventHandle.class); + doAnswer(a -> { + Boolean result = (Boolean)a.getArgument(0); + acknowledgementSet2.release(eventHandle5, result); + return null; + }).when(eventHandle5).release(any(Boolean.class)); JacksonEvent event5 = mock(JacksonEvent.class); lenient().when(event5.getEventHandle()).thenReturn(eventHandle5); eventHandle6 = mock(DefaultEventHandle.class); + doAnswer(a -> { + Boolean result = (Boolean)a.getArgument(0); + acknowledgementSet2.release(eventHandle6, result); + return null; + }).when(eventHandle6).release(any(Boolean.class)); JacksonEvent event6 = mock(JacksonEvent.class); lenient().when(event6.getEventHandle()).thenReturn(eventHandle6); - AcknowledgementSet acknowledgementSet2 = acknowledgementSetManager.create((flag) -> { result = flag; }, Duration.ofMillis(10000)); acknowledgementSet2.addProgressCheck((progressCheck) -> {currentRatio = progressCheck.getRatio();}, Duration.ofSeconds(1)); acknowledgementSet2.add(event3); acknowledgementSet2.add(event4); @@ -145,22 +182,22 @@ void testWithProgressCheckCallbacks() { lenient().when(eventHandle5.getAcknowledgementSet()).thenReturn(acknowledgementSet2); lenient().when(eventHandle6.getAcknowledgementSet()).thenReturn(acknowledgementSet2); acknowledgementSet2.complete(); - acknowledgementSetManager.releaseEventReference(eventHandle3, true); + eventHandle3.release(true); await().atMost(TEST_TIMEOUT.multipliedBy(5)) .untilAsserted(() -> { assertThat(currentRatio, equalTo(0.75)); }); - acknowledgementSetManager.releaseEventReference(eventHandle4, true); + eventHandle4.release(true); await().atMost(TEST_TIMEOUT.multipliedBy(5)) .untilAsserted(() -> { assertThat(currentRatio, equalTo(0.5)); }); - acknowledgementSetManager.releaseEventReference(eventHandle5, true); + eventHandle5.release(true); await().atMost(TEST_TIMEOUT.multipliedBy(5)) .untilAsserted(() -> { assertThat(currentRatio, equalTo(0.25)); }); - acknowledgementSetManager.releaseEventReference(eventHandle6, true); + eventHandle6.release(true); await().atMost(TEST_TIMEOUT.multipliedBy(5)) .untilAsserted(() -> { assertThat(result, equalTo(true)); @@ -170,14 +207,30 @@ void testWithProgressCheckCallbacks() { @Test void testWithProgressCheckCallbacks_AcksExpire() { + AcknowledgementSet acknowledgementSet2 = acknowledgementSetManager.create((flag) -> { result = flag; }, Duration.ofSeconds(10)); eventHandle3 = mock(DefaultEventHandle.class); + doAnswer(a -> { + Boolean result = (Boolean)a.getArgument(0); + acknowledgementSet2.release(eventHandle3, result); + return null; + }).when(eventHandle3).release(any(Boolean.class)); lenient().when(event3.getEventHandle()).thenReturn(eventHandle3); eventHandle4 = mock(DefaultEventHandle.class); + doAnswer(a -> { + Boolean result = (Boolean)a.getArgument(0); + acknowledgementSet2.release(eventHandle4, result); + return null; + }).when(eventHandle4).release(any(Boolean.class)); JacksonEvent event4 
= mock(JacksonEvent.class); lenient().when(event4.getEventHandle()).thenReturn(eventHandle4); eventHandle5 = mock(DefaultEventHandle.class); + doAnswer(a -> { + Boolean result = (Boolean)a.getArgument(0); + acknowledgementSet2.release(eventHandle5, result); + return null; + }).when(eventHandle5).release(any(Boolean.class)); JacksonEvent event5 = mock(JacksonEvent.class); lenient().when(event5.getEventHandle()).thenReturn(eventHandle5); @@ -185,7 +238,6 @@ void testWithProgressCheckCallbacks_AcksExpire() { JacksonEvent event6 = mock(JacksonEvent.class); lenient().when(event6.getEventHandle()).thenReturn(eventHandle6); - AcknowledgementSet acknowledgementSet2 = acknowledgementSetManager.create((flag) -> { result = flag; }, Duration.ofSeconds(10)); acknowledgementSet2.addProgressCheck((progressCheck) -> {currentRatio = progressCheck.getRatio();}, Duration.ofSeconds(1)); acknowledgementSet2.add(event3); acknowledgementSet2.add(event4); @@ -196,17 +248,17 @@ void testWithProgressCheckCallbacks_AcksExpire() { lenient().when(eventHandle5.getAcknowledgementSet()).thenReturn(acknowledgementSet2); lenient().when(eventHandle6.getAcknowledgementSet()).thenReturn(acknowledgementSet2); acknowledgementSet2.complete(); - acknowledgementSetManager.releaseEventReference(eventHandle3, true); + eventHandle3.release(true); await().atMost(TEST_TIMEOUT.multipliedBy(5)) .untilAsserted(() -> { assertThat(currentRatio, equalTo(0.75)); }); - acknowledgementSetManager.releaseEventReference(eventHandle4, true); + eventHandle4.release(true); await().atMost(TEST_TIMEOUT.multipliedBy(5)) .untilAsserted(() -> { assertThat(currentRatio, equalTo(0.5)); }); - acknowledgementSetManager.releaseEventReference(eventHandle5, true); + eventHandle5.release(true); await().atMost(TEST_TIMEOUT.multipliedBy(5)) .untilAsserted(() -> { assertThat(currentRatio, equalTo(0.25)); diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSetTests.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSetTests.java index 28e17d77cc..a3ee665adf 100644 --- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSetTests.java +++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/DefaultAcknowledgementSetTests.java @@ -91,14 +91,14 @@ void setupEvent() { AcknowledgementSet acknowledgementSet = a.getArgument(0); lenient().when(handle.getAcknowledgementSet()).thenReturn(acknowledgementSet); return null; - }).when(handle).setAcknowledgementSet(any(AcknowledgementSet.class)); + }).when(handle).addAcknowledgementSet(any(AcknowledgementSet.class)); lenient().when(event.getEventHandle()).thenReturn(handle); event2 = mock(JacksonEvent.class); lenient().doAnswer(a -> { AcknowledgementSet acknowledgementSet = a.getArgument(0); lenient().when(handle2.getAcknowledgementSet()).thenReturn(acknowledgementSet); return null; - }).when(handle2).setAcknowledgementSet(any(AcknowledgementSet.class)); + }).when(handle2).addAcknowledgementSet(any(AcknowledgementSet.class)); handle2 = mock(DefaultEventHandle.class); lenient().when(event2.getEventHandle()).thenReturn(handle2); } @@ -186,7 +186,7 @@ void testDefaultAcknowledgementSetNegativeAcknowledgements() throws Exception { AcknowledgementSet acknowledgementSet = a.getArgument(0); lenient().when(handle.getAcknowledgementSet()).thenReturn(acknowledgementSet); return null; - }).when(handle).setAcknowledgementSet(any(AcknowledgementSet.class)); + 
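The progress-check assertions in these tests (0.75, then 0.5, then 0.25) follow from one definition: the ratio is pending events over total events added, so releasing one of four events leaves 0.75. A toy model of that arithmetic, assuming that definition rather than quoting the real ProgressCheck implementation:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Toy model of the progress ratio the assertions above rely on.
class ProgressRatioSketch {
    private final AtomicInteger totalAdded = new AtomicInteger();
    private final AtomicInteger pending = new AtomicInteger();

    void add()     { totalAdded.incrementAndGet(); pending.incrementAndGet(); }
    void release() { pending.decrementAndGet(); }

    double ratio() { return (double) pending.get() / totalAdded.get(); }

    public static void main(String[] args) {
        ProgressRatioSketch set = new ProgressRatioSketch();
        for (int i = 0; i < 4; i++) set.add();  // four events in the set
        set.release();                          // acknowledge one of them
        System.out.println(set.ratio());        // 0.75, matching the first assertion
    }
}
```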
}).when(handle).addAcknowledgementSet(any(AcknowledgementSet.class)); assertThat(handle.getAcknowledgementSet(), equalTo(defaultAcknowledgementSet)); defaultAcknowledgementSet.acquire(handle); assertThat(defaultAcknowledgementSet.release(handle, true), equalTo(false)); @@ -219,7 +219,7 @@ void testDefaultAcknowledgementSetExpirations() throws Exception { AcknowledgementSet acknowledgementSet = a.getArgument(0); lenient().when(handle.getAcknowledgementSet()).thenReturn(acknowledgementSet); return null; - }).when(handle).setAcknowledgementSet(any(AcknowledgementSet.class)); + }).when(handle).addAcknowledgementSet(any(AcknowledgementSet.class)); assertThat(handle, not(equalTo(null))); assertThat(handle.getAcknowledgementSet(), equalTo(defaultAcknowledgementSet)); assertThat(defaultAcknowledgementSet.release(handle, true), equalTo(true)); @@ -253,7 +253,7 @@ void testDefaultAcknowledgementSetWithProgressCheck() throws Exception { AcknowledgementSet acknowledgementSet = a.getArgument(0); lenient().when(handle.getAcknowledgementSet()).thenReturn(acknowledgementSet); return null; - }).when(handle).setAcknowledgementSet(any(AcknowledgementSet.class)); + }).when(handle).addAcknowledgementSet(any(AcknowledgementSet.class)); assertThat(handle, not(equalTo(null))); assertThat(handle.getAcknowledgementSet(), equalTo(defaultAcknowledgementSet)); await().atMost(Duration.ofSeconds(5)) diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/InactiveAcknowledgementSetManagerTests.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/InactiveAcknowledgementSetManagerTests.java index eb1303d487..8a0a4d2ffd 100644 --- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/InactiveAcknowledgementSetManagerTests.java +++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/acknowledgements/InactiveAcknowledgementSetManagerTests.java @@ -7,12 +7,9 @@ import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; -import static org.mockito.Mockito.mock; import static org.hamcrest.MatcherAssert.assertThat; import static org.hamcrest.CoreMatchers.notNullValue; import static org.junit.jupiter.api.Assertions.assertThrows; -import org.opensearch.dataprepper.model.event.EventHandle; -import org.opensearch.dataprepper.model.event.Event; import java.time.Duration; @@ -30,25 +27,4 @@ void testCreateAPI() { assertThrows(UnsupportedOperationException.class, () -> acknowledgementSetManager.create((a)->{}, Duration.ofMillis(10))); } - @Test - void testEventAcquireAPI() { - assertThat(acknowledgementSetManager, notNullValue()); - Event event = mock(Event.class); - assertThrows(UnsupportedOperationException.class, () -> acknowledgementSetManager.acquireEventReference(event)); - } - - @Test - void testEventHandleAcquireAPI() { - assertThat(acknowledgementSetManager, notNullValue()); - EventHandle eventHandle = mock(EventHandle.class); - assertThrows(UnsupportedOperationException.class, () -> acknowledgementSetManager.acquireEventReference(eventHandle)); - } - - @Test - void testReleaseAPI() { - assertThat(acknowledgementSetManager, notNullValue()); - EventHandle eventHandle = mock(EventHandle.class); - assertThrows(UnsupportedOperationException.class, () -> acknowledgementSetManager.releaseEventReference(eventHandle, true)); - } - } diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/PeerForwarderProviderTest.java 
b/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/DefaultPeerForwarderProviderTest.java similarity index 82% rename from data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/PeerForwarderProviderTest.java rename to data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/DefaultPeerForwarderProviderTest.java index 08964d3a80..4c1c36482c 100644 --- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/PeerForwarderProviderTest.java +++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/DefaultPeerForwarderProviderTest.java @@ -13,6 +13,7 @@ import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.dataprepper.model.processor.Processor; import org.opensearch.dataprepper.peerforwarder.client.PeerForwarderClient; import org.opensearch.dataprepper.peerforwarder.discovery.DiscoveryMode; @@ -35,7 +36,7 @@ import static org.mockito.Mockito.when; @ExtendWith(MockitoExtension.class) -class PeerForwarderProviderTest { +class DefaultPeerForwarderProviderTest { private static final int PIPELINE_WORKER_THREADS = new Random().nextInt(10) + 1; @Mock @@ -50,6 +51,9 @@ class PeerForwarderProviderTest { @Mock private HashRing hashRing; + @Mock + private Processor processor; + @Mock private PluginMetrics pluginMetrics; @@ -71,13 +75,13 @@ void setUp() { } private PeerForwarderProvider createObjectUnderTest() { - return new PeerForwarderProvider(peerForwarderClientFactory, peerForwarderClient, peerForwarderConfiguration, pluginMetrics); + return new DefaultPeerForwarderProvider(peerForwarderClientFactory, peerForwarderClient, peerForwarderConfiguration, pluginMetrics); } @Test void register_creates_a_new_RemotePeerForwarder_with_cloud_map_discovery_mode() { when(peerForwarderConfiguration.getDiscoveryMode()).thenReturn(DiscoveryMode.AWS_CLOUD_MAP); - final PeerForwarder peerForwarder = createObjectUnderTest().register(pipelineName, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); + final PeerForwarder peerForwarder = createObjectUnderTest().register(pipelineName, processor, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); assertThat(peerForwarder, instanceOf(RemotePeerForwarder.class)); } @@ -86,7 +90,7 @@ void register_creates_a_new_RemotePeerForwarder_with_cloud_map_discovery_mode() void register_creates_a_new_RemotePeerForwarder_with_static_discovery_mode_of_size_grater_than_one() { when(peerForwarderConfiguration.getDiscoveryMode()).thenReturn(DiscoveryMode.STATIC); when(peerForwarderConfiguration.getStaticEndpoints()).thenReturn(List.of("endpoint1", "endpoint2")); - final PeerForwarder peerForwarder = createObjectUnderTest().register(pipelineName, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); + final PeerForwarder peerForwarder = createObjectUnderTest().register(pipelineName, processor, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); assertThat(peerForwarder, instanceOf(RemotePeerForwarder.class)); } @@ -95,14 +99,14 @@ void register_creates_a_new_RemotePeerForwarder_with_static_discovery_mode_of_si void register_creates_a_new_RemotePeerForwarder_with_static_discovery_mode_of_size_one() { when(peerForwarderConfiguration.getDiscoveryMode()).thenReturn(DiscoveryMode.STATIC); when(peerForwarderConfiguration.getStaticEndpoints()).thenReturn(List.of("endpoint1")); - final PeerForwarder peerForwarder = createObjectUnderTest().register(pipelineName, pluginId, identificationKeys, 
PIPELINE_WORKER_THREADS); + final PeerForwarder peerForwarder = createObjectUnderTest().register(pipelineName, processor, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); assertThat(peerForwarder, instanceOf(LocalPeerForwarder.class)); } @Test void register_creates_a_new_LocalPeerForwarder_with_local_discovery_mode() { - final PeerForwarder peerForwarder = createObjectUnderTest().register(pipelineName, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); + final PeerForwarder peerForwarder = createObjectUnderTest().register(pipelineName, processor, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); assertThat(peerForwarder, instanceOf(LocalPeerForwarder.class)); } @@ -110,7 +114,7 @@ void register_creates_a_new_LocalPeerForwarder_with_local_discovery_mode() { @Test void register_creates_HashRing_if_peer_forwarding_is_required() { when(peerForwarderConfiguration.getDiscoveryMode()).thenReturn(DiscoveryMode.AWS_CLOUD_MAP); - createObjectUnderTest().register(pipelineName, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); + createObjectUnderTest().register(pipelineName, processor, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); verify(peerForwarderClientFactory).createHashRing(); } @@ -121,7 +125,7 @@ void register_called_multiple_times_creates_only_one_HashRing_if_peer_forwarding final PeerForwarderProvider objectUnderTest = createObjectUnderTest(); for (int i = 0; i < 10; i++) - objectUnderTest.register(pipelineName, UUID.randomUUID().toString(), identificationKeys, PIPELINE_WORKER_THREADS); + objectUnderTest.register(pipelineName, processor, UUID.randomUUID().toString(), identificationKeys, PIPELINE_WORKER_THREADS); verify(peerForwarderClientFactory, times(1)).createHashRing(); } @@ -137,17 +141,17 @@ void isAtLeastOnePeerForwarderRegistered_should_return_false_if_register_is_not_ void isAtLeastOnePeerForwarderRegistered_should_throw_when_register_is_called_with_same_pipeline_and_plugin() { final PeerForwarderProvider objectUnderTest = createObjectUnderTest(); - objectUnderTest.register(pipelineName, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); + objectUnderTest.register(pipelineName, processor, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); assertThrows(RuntimeException.class, () -> - objectUnderTest.register(pipelineName, pluginId, identificationKeys, PIPELINE_WORKER_THREADS)); + objectUnderTest.register(pipelineName, processor, pluginId, identificationKeys, PIPELINE_WORKER_THREADS)); } @Test void isAtLeastOnePeerForwarderRegistered_should_return_false_if_register_is_called_with_local_discovery_mode() { final PeerForwarderProvider objectUnderTest = createObjectUnderTest(); - objectUnderTest.register(pipelineName, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); + objectUnderTest.register(pipelineName, processor, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); assertThat(objectUnderTest.isPeerForwardingRequired(), equalTo(false)); } @@ -157,7 +161,7 @@ void isAtLeastOnePeerForwarderRegistered_should_return_true_if_register_is_calle when(peerForwarderConfiguration.getDiscoveryMode()).thenReturn(DiscoveryMode.AWS_CLOUD_MAP); final PeerForwarderProvider objectUnderTest = createObjectUnderTest(); - objectUnderTest.register(pipelineName, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); + objectUnderTest.register(pipelineName, processor, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); assertThat(objectUnderTest.isPeerForwardingRequired(), equalTo(true)); } @@ -179,7 +183,7 @@ void 
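Several of the tests above pin down the provider's duplicate-registration guard: registering the same pipeline/plugin pair twice must throw. A minimal model of the bookkeeping behind that guard, mirroring the nested-map check in DefaultPeerForwarderProvider (the exception message here is paraphrased):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the nested pipeline -> pluginId bookkeeping that rejects a
// second registration of the same pair.
class DuplicateRegistrationSketch {
    private final Map<String, Map<String, Object>> buffersByPipeline = new HashMap<>();

    void register(String pipelineName, String pluginId, Object buffer) {
        Map<String, Object> byPlugin =
                buffersByPipeline.computeIfAbsent(pipelineName, k -> new HashMap<>());
        if (byPlugin.containsKey(pluginId)) {
            throw new RuntimeException(
                    "only a single peer-forwarder per pipeline/plugin type is supported");
        }
        byPlugin.put(pluginId, buffer);
    }
}
```

A test exercises this by calling register twice with identical identifiers and wrapping the second call in assertThrows(RuntimeException.class, ...), as the hunk above does.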
getPipelinePeerForwarderReceiveBufferMap_should_return_empty_map_when_regis void getPipelinePeerForwarderReceiveBufferMap_should_return_non_empty_map_when_register_is_called() { final PeerForwarderProvider objectUnderTest = createObjectUnderTest(); - objectUnderTest.register(pipelineName, UUID.randomUUID().toString(), identificationKeys, PIPELINE_WORKER_THREADS); + objectUnderTest.register(pipelineName, processor, UUID.randomUUID().toString(), identificationKeys, PIPELINE_WORKER_THREADS); final Map>>> pipelinePeerForwarderReceiveBufferMap = objectUnderTest .getPipelinePeerForwarderReceiveBufferMap(); @@ -189,4 +193,4 @@ void getPipelinePeerForwarderReceiveBufferMap_should_return_non_empty_map_when_r assertThat(pipelinePeerForwarderReceiveBufferMap.size(), equalTo(1)); assertThat(pipelinePeerForwarderReceiveBufferMap.containsKey(pipelineName), equalTo(true)); } -} \ No newline at end of file +} diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/PeerForwarder_ClientServerIT.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/PeerForwarder_ClientServerIT.java index 2b4d875e45..f706cb97d7 100644 --- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/PeerForwarder_ClientServerIT.java +++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/PeerForwarder_ClientServerIT.java @@ -22,6 +22,7 @@ import org.opensearch.dataprepper.model.event.Event; import org.opensearch.dataprepper.model.event.JacksonEvent; import org.opensearch.dataprepper.model.record.Record; +import org.opensearch.dataprepper.model.processor.Processor; import org.opensearch.dataprepper.peerforwarder.certificate.CertificateProviderFactory; import org.opensearch.dataprepper.peerforwarder.client.PeerForwarderClient; import org.opensearch.dataprepper.peerforwarder.codec.PeerForwarderCodecAppConfig; @@ -133,7 +134,7 @@ private PeerForwarderProvider createPeerForwarderProvider( final PeerForwarderClient clientForProvider = createClient(peerForwarderConfiguration); final PeerClientPool peerClientPool = new PeerClientPool(); final PeerForwarderClientFactory clientFactoryForProvider = new PeerForwarderClientFactory(peerForwarderConfiguration, peerClientPool, certificateProviderFactory, pluginMetrics); - return new PeerForwarderProvider(clientFactoryForProvider, clientForProvider, peerForwarderConfiguration, pluginMetrics); + return new DefaultPeerForwarderProvider(clientFactoryForProvider, clientForProvider, peerForwarderConfiguration, pluginMetrics); } private PeerForwarderClient createClient( @@ -160,6 +161,7 @@ private Collection> getServerSideRecords(final PeerForwarderProvid class WithSSL { private PeerForwarderServer server; private PeerForwarderProvider peerForwarderProvider; + private Processor processor; void setUpServer(final boolean binaryCodec) { peerForwarderConfiguration = createConfiguration(true, ForwardingAuthentication.UNAUTHENTICATED, binaryCodec); @@ -168,7 +170,7 @@ void setUpServer(final boolean binaryCodec) { final CertificateProviderFactory certificateProviderFactory = new CertificateProviderFactory(peerForwarderConfiguration); peerForwarderProvider = createPeerForwarderProvider(peerForwarderConfiguration, certificateProviderFactory); - peerForwarderProvider.register(pipelineName, pluginId, Collections.singleton(UUID.randomUUID().toString()), PIPELINE_WORKER_THREADS); + peerForwarderProvider.register(pipelineName, processor, pluginId, Collections.singleton(UUID.randomUUID().toString()), 
PIPELINE_WORKER_THREADS); server = createServer(peerForwarderConfiguration, certificateProviderFactory, peerForwarderProvider); server.start(); } @@ -280,6 +282,7 @@ void send_Events_with_fingerprint_verification_to_unknown_server_should_throw(fi class WithoutSSL { private PeerForwarderServer server; private PeerForwarderProvider peerForwarderProvider; + private Processor processor; void setUpServer(final boolean binaryCodec) { peerForwarderConfiguration = createConfiguration(false, ForwardingAuthentication.UNAUTHENTICATED, binaryCodec); @@ -288,7 +291,7 @@ void setUpServer(final boolean binaryCodec) { final CertificateProviderFactory certificateProviderFactory = new CertificateProviderFactory(peerForwarderConfiguration); peerForwarderProvider = createPeerForwarderProvider(peerForwarderConfiguration, certificateProviderFactory); - peerForwarderProvider.register(pipelineName, pluginId, Collections.singleton(UUID.randomUUID().toString()), PIPELINE_WORKER_THREADS); + peerForwarderProvider.register(pipelineName, processor, pluginId, Collections.singleton(UUID.randomUUID().toString()), PIPELINE_WORKER_THREADS); server = createServer(peerForwarderConfiguration, certificateProviderFactory, peerForwarderProvider); server.start(); } @@ -339,6 +342,7 @@ void send_Events_to_server_when_expecting_SSL_should_throw(final boolean binaryC class WithMutualTls { private PeerForwarderServer server; private PeerForwarderProvider peerForwarderProvider; + private Processor processor; void setUpServer(final boolean binaryCodec) { peerForwarderConfiguration = createConfiguration(true, ForwardingAuthentication.MUTUAL_TLS, binaryCodec); @@ -347,7 +351,7 @@ void setUpServer(final boolean binaryCodec) { final CertificateProviderFactory certificateProviderFactory = new CertificateProviderFactory(peerForwarderConfiguration); peerForwarderProvider = createPeerForwarderProvider(peerForwarderConfiguration, certificateProviderFactory); - peerForwarderProvider.register(pipelineName, pluginId, Collections.singleton(UUID.randomUUID().toString()), PIPELINE_WORKER_THREADS); + peerForwarderProvider.register(pipelineName, processor, pluginId, Collections.singleton(UUID.randomUUID().toString()), PIPELINE_WORKER_THREADS); server = createServer(peerForwarderConfiguration, certificateProviderFactory, peerForwarderProvider); server.start(); } diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/PeerForwardingProcessingDecoratorTest.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/PeerForwardingProcessingDecoratorTest.java index d0c71a52d0..7a85033842 100644 --- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/PeerForwardingProcessingDecoratorTest.java +++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/PeerForwardingProcessingDecoratorTest.java @@ -17,6 +17,8 @@ import org.apache.commons.collections.CollectionUtils; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.lenient; import org.opensearch.dataprepper.peerforwarder.exception.EmptyPeerForwarderPluginIdentificationKeysException; import org.opensearch.dataprepper.peerforwarder.exception.UnsupportedPeerForwarderPluginException; @@ -37,6 +39,7 @@ import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; import static org.mockito.Mockito.verifyNoInteractions; +import static org.mockito.Mockito.verifyNoMoreInteractions; import static 
org.mockito.Mockito.when; @ExtendWith(MockitoExtension.class) @@ -68,13 +71,13 @@ record = mock(Record.class); pluginId = UUID.randomUUID().toString(); } - private List createObjectUnderTesDecoratedProcessors(final List processors) { + private List createObjectUnderTestDecoratedProcessors(final List processors) { return PeerForwardingProcessorDecorator.decorateProcessors(processors, peerForwarderProvider, pipelineName, pluginId, PIPELINE_WORKER_THREADS); } @Test void PeerForwardingProcessingDecorator_should_not_have_any_interactions_if_its_not_an_instance_of_RequiresPeerForwarding() { - assertThrows(UnsupportedPeerForwarderPluginException.class, () -> createObjectUnderTesDecoratedProcessors(Collections.singletonList(processor))); + assertThrows(UnsupportedPeerForwarderPluginException.class, () -> createObjectUnderTestDecoratedProcessors(Collections.singletonList(processor))); verifyNoInteractions(peerForwarderProvider); } @@ -83,7 +86,7 @@ void PeerForwardingProcessingDecorator_should_not_have_any_interactions_if_its_n void PeerForwardingProcessingDecorator_execute_with_empty_identification_keys_should_throw() { when(requiresPeerForwarding.getIdentificationKeys()).thenReturn(Collections.emptySet()); - assertThrows(EmptyPeerForwarderPluginIdentificationKeysException.class, () -> createObjectUnderTesDecoratedProcessors(Collections.singletonList((Processor) requiresPeerForwarding))); + assertThrows(EmptyPeerForwarderPluginIdentificationKeysException.class, () -> createObjectUnderTestDecoratedProcessors(Collections.singletonList((Processor) requiresPeerForwarding))); } @Test @@ -95,12 +98,12 @@ void decorateProcessors_with_different_identification_key_should_throw() { when(requiresPeerForwarding.getIdentificationKeys()).thenReturn(Set.of(UUID.randomUUID().toString())); when(requiresPeerForwardingCopy.getIdentificationKeys()).thenReturn(Set.of(UUID.randomUUID().toString())); - assertThrows(RuntimeException.class, () -> createObjectUnderTesDecoratedProcessors(List.of(((Processor) requiresPeerForwarding), (Processor) requiresPeerForwardingCopy))); + assertThrows(RuntimeException.class, () -> createObjectUnderTestDecoratedProcessors(List.of(((Processor) requiresPeerForwarding), (Processor) requiresPeerForwardingCopy))); } @Test void decorateProcessors_with_empty_processors_should_return_empty_list_of_processors() { - final List processors = createObjectUnderTesDecoratedProcessors(Collections.emptyList()); + final List processors = createObjectUnderTestDecoratedProcessors(Collections.emptyList()); assertThat(processors.size(), equalTo(0)); } @@ -115,16 +118,66 @@ class WithRegisteredPeerForwarder { void setUp() { identificationKeys = Set.of(TEST_IDENTIFICATION_KEY); - when(peerForwarderProvider.register(pipelineName, pluginId, identificationKeys, PIPELINE_WORKER_THREADS)).thenReturn(peerForwarder); - when(requiresPeerForwarding.getIdentificationKeys()).thenReturn(identificationKeys); processor = (Processor) requiresPeerForwarding; + lenient().when(peerForwarderProvider.register(pipelineName, processor, pluginId, identificationKeys, PIPELINE_WORKER_THREADS)).thenReturn(peerForwarder); + when(requiresPeerForwarding.getIdentificationKeys()).thenReturn(identificationKeys); } @Test void PeerForwardingProcessingDecorator_should_have_interaction_with_getIdentificationKeys() { - createObjectUnderTesDecoratedProcessors(Collections.singletonList(processor)); + createObjectUnderTestDecoratedProcessors(Collections.singletonList(processor)); + verify(requiresPeerForwarding, times(2)).getIdentificationKeys(); + 
verify(peerForwarderProvider).register(pipelineName, processor, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); + verifyNoMoreInteractions(peerForwarderProvider); + } + + @Test + void PeerForwardingProcessingDecorator_should_have_interaction_with_getIdentificationKeys_when_list_of_processors() { + when(requiresPeerForwarding.getIdentificationKeys()).thenReturn(identificationKeys); + when(requiresPeerForwardingCopy.getIdentificationKeys()).thenReturn(identificationKeys); + + createObjectUnderTestDecoratedProcessors(List.of((Processor) requiresPeerForwarding, (Processor) requiresPeerForwardingCopy)); + verify(requiresPeerForwarding, times(2)).getIdentificationKeys(); - verify(peerForwarderProvider).register(pipelineName, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); + verify(peerForwarderProvider).register(pipelineName, processor, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); + verifyNoMoreInteractions(peerForwarderProvider); + } + + @Test + void PeerForwardingProcessingDecorator_with_localProcessingOnly() { + List processorList = new ArrayList<>(); + processorList.add((Processor) requiresPeerForwarding); + processorList.add((Processor) requiresPeerForwardingCopy); + + LocalPeerForwarder localPeerForwarder = mock(LocalPeerForwarder.class); + when(peerForwarderProvider.register(pipelineName, (Processor) requiresPeerForwarding, pluginId, identificationKeys, PIPELINE_WORKER_THREADS)).thenReturn(localPeerForwarder); + Event event = mock(Event.class); + when(record.getData()).thenReturn(event); + List> testData = Collections.singletonList(record); + when(requiresPeerForwarding.isApplicableEventForPeerForwarding(event)).thenReturn(false); + when(requiresPeerForwardingCopy.isApplicableEventForPeerForwarding(event)).thenReturn(false); + + Processor processor1 = (Processor)requiresPeerForwarding; + Processor processor2 = (Processor)requiresPeerForwardingCopy; + when(processor1.execute(testData)).thenReturn(testData); + when(processor2.execute(testData)).thenReturn(testData); + + when(requiresPeerForwarding.getIdentificationKeys()).thenReturn(identificationKeys); + when(requiresPeerForwardingCopy.getIdentificationKeys()).thenReturn(identificationKeys); + + when(requiresPeerForwarding.isForLocalProcessingOnly(any())).thenReturn(true); + when(requiresPeerForwardingCopy.isForLocalProcessingOnly(any())).thenReturn(true); + + final List processors = createObjectUnderTestDecoratedProcessors(processorList); + assertThat(processors.size(), equalTo(2)); + verify(peerForwarderProvider, times(1)).register(pipelineName, processor, pluginId, identificationKeys, PIPELINE_WORKER_THREADS); + verifyNoMoreInteractions(peerForwarderProvider); + Collection> result = processors.get(0).execute(testData); + assertThat(result.size(), equalTo(testData.size())); + assertThat(result, equalTo(testData)); + result = processors.get(1).execute(testData); + assertThat(result.size(), equalTo(testData.size())); + assertThat(result, equalTo(testData)); } @Test @@ -138,7 +191,7 @@ void PeerForwardingProcessingDecorator_execute_should_forwardRecords_with_correc when(processor.execute(testData)).thenReturn(testData); - final List processors = createObjectUnderTesDecoratedProcessors(Collections.singletonList(processor)); + final List processors = createObjectUnderTestDecoratedProcessors(Collections.singletonList(processor)); assertThat(processors.size(), equalTo(1)); final Collection> records = processors.get(0).execute(testData); @@ -164,7 +217,7 @@ void 
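The localProcessingOnly test above depends on the decorator's split: events that are applicable for peer forwarding go out through the forwarder, the rest stay local, and the inner processor runs on whatever comes back merged with the local records. A simplified, compilable sketch of that flow; the real PeerForwardingProcessorDecorator differs in types and details:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;
import java.util.function.UnaryOperator;

// Simplified model of the decorator's execute() path. R stands in for
// Record<Event>; the functional parameters stand in for the peer forwarder
// and the wrapped processor.
class DecoratorExecuteSketch {
    static <R> List<R> execute(List<R> records,
                               Predicate<R> applicableForForwarding,
                               UnaryOperator<List<R>> forwardAndReceive,
                               UnaryOperator<List<R>> innerProcessor) {
        List<R> toForward = new ArrayList<>();
        List<R> local = new ArrayList<>();
        for (R record : records) {
            (applicableForForwarding.test(record) ? toForward : local).add(record);
        }
        // Forwarded records come back (from this node or its peers) and are
        // processed together with the records that never left.
        List<R> toProcess = new ArrayList<>(forwardAndReceive.apply(toForward));
        toProcess.addAll(local);
        return innerProcessor.apply(toProcess);
    }
}
```

When isForLocalProcessingOnly holds, the provider hands the decorator a LocalPeerForwarder, so forwardAndReceive is effectively an identity step and every record is processed in place, which is what the test's equalTo(testData) assertions check.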
PeerForwardingProcessingDecorator_execute_should_receiveRecords() { when(((Processor) requiresPeerForwarding).execute(anyCollection())).thenReturn(expectedRecordsToProcessLocally); - final List processors = createObjectUnderTesDecoratedProcessors(Collections.singletonList((Processor) requiresPeerForwarding)); + final List processors = createObjectUnderTestDecoratedProcessors(Collections.singletonList((Processor) requiresPeerForwarding)); assertThat(processors.size(), equalTo(1)); final Collection> records = processors.get(0).execute(forwardTestData); @@ -181,7 +234,7 @@ void PeerForwardingProcessingDecorator_execute_will_call_inner_processors_execut Event event = mock(Event.class); when(record.getData()).thenReturn(event); when(requiresPeerForwarding.isApplicableEventForPeerForwarding(event)).thenReturn(true); - final List processors = createObjectUnderTesDecoratedProcessors(Collections.singletonList(processor)); + final List processors = createObjectUnderTestDecoratedProcessors(Collections.singletonList(processor)); Collection> testData = Collections.singletonList(record); assertThat(processors.size(), equalTo(1)); @@ -195,9 +248,9 @@ void PeerForwardingProcessingDecorator_execute_will_call_inner_processors_execut Event event = mock(Event.class); when(record.getData()).thenReturn(event); when(requiresPeerForwarding.isApplicableEventForPeerForwarding(event)).thenReturn(false); - when(requiresPeerForwarding.isForLocalProcessingOnly(event)).thenReturn(true); + when(requiresPeerForwarding.isForLocalProcessingOnly(any())).thenReturn(true); - final List processors = createObjectUnderTesDecoratedProcessors(Collections.singletonList(processor)); + final List processors = createObjectUnderTestDecoratedProcessors(Collections.singletonList(processor)); Collection> testData = Collections.singletonList(record); assertThat(processors.size(), equalTo(1)); @@ -220,10 +273,8 @@ void PeerForwardingProcessingDecorator_inner_processor_with_is_applicable_event_ when(requiresPeerForwarding.isApplicableEventForPeerForwarding(event1)).thenReturn(false); when(requiresPeerForwarding.isApplicableEventForPeerForwarding(event2)).thenReturn(false); when(requiresPeerForwarding.isApplicableEventForPeerForwarding(event3)).thenReturn(false); - when(requiresPeerForwarding.isForLocalProcessingOnly(event1)).thenReturn(true); - when(requiresPeerForwarding.isForLocalProcessingOnly(event2)).thenReturn(true); - when(requiresPeerForwarding.isForLocalProcessingOnly(event3)).thenReturn(true); - final List processors = createObjectUnderTesDecoratedProcessors(Collections.singletonList(processor)); + when(requiresPeerForwarding.isForLocalProcessingOnly(any())).thenReturn(true); + final List processors = createObjectUnderTestDecoratedProcessors(Collections.singletonList(processor)); when(record1.getData()).thenReturn(event1); when(record2.getData()).thenReturn(event2); when(record3.getData()).thenReturn(event3); @@ -253,8 +304,8 @@ void PeerForwardingProcessingDecorator_inner_processor_with_is_applicable_event_ when(requiresPeerForwarding.isApplicableEventForPeerForwarding(event1)).thenReturn(true); when(requiresPeerForwarding.isApplicableEventForPeerForwarding(event2)).thenReturn(false); when(requiresPeerForwarding.isApplicableEventForPeerForwarding(event3)).thenReturn(true); - when(requiresPeerForwarding.isForLocalProcessingOnly(event2)).thenReturn(false); - final List processors = createObjectUnderTesDecoratedProcessors(Collections.singletonList(processor)); + 
when(requiresPeerForwarding.isForLocalProcessingOnly(any())).thenReturn(false); + final List processors = createObjectUnderTestDecoratedProcessors(Collections.singletonList(processor)); when(record1.getData()).thenReturn(event1); when(record2.getData()).thenReturn(event2); when(record3.getData()).thenReturn(event3); @@ -273,7 +324,7 @@ void PeerForwardingProcessingDecorator_inner_processor_with_is_applicable_event_ @Test void PeerForwardingProcessingDecorator_prepareForShutdown_will_call_inner_processors_prepareForShutdown() { - final List processors = createObjectUnderTesDecoratedProcessors(Collections.singletonList(processor)); + final List processors = createObjectUnderTestDecoratedProcessors(Collections.singletonList(processor)); assertThat(processors.size(), equalTo(1)); processors.get(0).prepareForShutdown(); @@ -282,7 +333,7 @@ void PeerForwardingProcessingDecorator_prepareForShutdown_will_call_inner_proces @Test void PeerForwardingProcessingDecorator_isReadyForShutdown_will_call_inner_processors_isReadyForShutdown() { - final List processors = createObjectUnderTesDecoratedProcessors(Collections.singletonList(processor)); + final List processors = createObjectUnderTestDecoratedProcessors(Collections.singletonList(processor)); assertThat(processors.size(), equalTo(1)); processors.get(0).isReadyForShutdown(); @@ -291,7 +342,7 @@ void PeerForwardingProcessingDecorator_isReadyForShutdown_will_call_inner_proces @Test void PeerForwardingProcessingDecorator_shutdown_will_call_inner_processors_shutdown() { - final List processors = createObjectUnderTesDecoratedProcessors(Collections.singletonList(processor)); + final List processors = createObjectUnderTestDecoratedProcessors(Collections.singletonList(processor)); assertThat(processors.size(), equalTo(1)); processors.get(0).shutdown(); diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/codec/JavaPeerForwarderCodecTest.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/codec/JavaPeerForwarderCodecTest.java index 70a1e737d8..bd0b26e05f 100644 --- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/codec/JavaPeerForwarderCodecTest.java +++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/codec/JavaPeerForwarderCodecTest.java @@ -78,7 +78,7 @@ void testCodec_with_acknowledgementSet() throws IOException, ClassNotFoundExcept inputEvents.getEvents().stream() .map(Event::getEventHandle) .map(handle -> (InternalEventHandle)handle) - .forEach(handle -> handle.setAcknowledgementSet(mock(AcknowledgementSet.class))); + .forEach(handle -> handle.addAcknowledgementSet(mock(AcknowledgementSet.class))); final byte[] bytes = createObjectUnderTest().serialize(inputEvents); final PeerForwardingEvents outputEvents = createObjectUnderTest().deserialize(bytes); assertThat(outputEvents.getDestinationPipelineName(), equalTo(inputEvents.getDestinationPipelineName())); @@ -119,4 +119,4 @@ private PeerForwardingEvents generatePeerForwardingEvents(final int numEvents) { } return new PeerForwardingEvents(events, pluginId, pipelineName); } -} \ No newline at end of file +} diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/discovery/DnsPeerListProviderTest.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/discovery/DnsPeerListProviderTest.java index 1083eea9f0..3bdee15368 100644 --- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/discovery/DnsPeerListProviderTest.java 
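A note on the codec test change above: the event-handle API has moved from setAcknowledgementSet to addAcknowledgementSet, i.e. a handle may now carry more than one acknowledgement set. A minimal sketch of the test idiom, assuming only the InternalEventHandle and AcknowledgementSet types that appear in this diff (the event variable stands in for any event carrying a handle):

    // Hedged sketch: attach an acknowledgement set to an event's handle in a test.
    final InternalEventHandle handle = (InternalEventHandle) event.getEventHandle();
    handle.addAcknowledgementSet(mock(AcknowledgementSet.class)); // may be repeated with further sets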
diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/discovery/DnsPeerListProviderTest.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/discovery/DnsPeerListProviderTest.java
index 1083eea9f0..3bdee15368 100644
--- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/discovery/DnsPeerListProviderTest.java
+++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/discovery/DnsPeerListProviderTest.java
@@ -7,30 +7,33 @@
 import com.linecorp.armeria.client.Endpoint;
 import com.linecorp.armeria.client.endpoint.dns.DnsAddressEndpointGroup;
-import io.micrometer.core.instrument.Measurement;
-import org.junit.Before;
-import org.junit.Test;
-import org.junit.runner.RunWith;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.mockito.ArgumentCaptor;
 import org.mockito.Mock;
-import org.mockito.junit.MockitoJUnitRunner;
-import org.opensearch.dataprepper.metrics.MetricNames;
-import org.opensearch.dataprepper.metrics.MetricsTestUtil;
+import org.mockito.junit.jupiter.MockitoExtension;
 import org.opensearch.dataprepper.metrics.PluginMetrics;
 import org.opensearch.dataprepper.peerforwarder.HashRing;
 
 import java.util.Arrays;
 import java.util.Collections;
 import java.util.List;
-import java.util.StringJoiner;
 import java.util.concurrent.CompletableFuture;
+import java.util.function.ToDoubleFunction;
 
-import static org.junit.Assert.assertEquals;
-import static org.junit.Assert.assertTrue;
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.junit.jupiter.api.Assertions.assertEquals;
+import static org.junit.jupiter.api.Assertions.assertThrows;
+import static org.junit.jupiter.api.Assertions.assertTrue;
+import static org.mockito.ArgumentMatchers.eq;
 import static org.mockito.Mockito.mock;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.when;
+import static org.opensearch.dataprepper.peerforwarder.discovery.PeerListProvider.PEER_ENDPOINTS;
 
-@RunWith(MockitoJUnitRunner.class)
+@ExtendWith(MockitoExtension.class)
 public class DnsPeerListProviderTest {
 
     private static final String ENDPOINT_1 = "10.1.1.1";
@@ -39,8 +42,6 @@ public class DnsPeerListProviderTest {
             Endpoint.of(ENDPOINT_1),
             Endpoint.of(ENDPOINT_2)
     );
-    private static final String COMPONENT_SCOPE = "testComponentScope";
-    private static final String COMPONENT_ID = "testComponentId";
 
     @Mock
     private DnsAddressEndpointGroup dnsAddressEndpointGroup;
@@ -48,34 +49,33 @@ public class DnsPeerListProviderTest {
     @Mock
     private HashRing hashRing;
 
+    @Mock
     private PluginMetrics pluginMetrics;
 
     private CompletableFuture completableFuture;
 
     private DnsPeerListProvider dnsPeerListProvider;
 
-    @Before
+    @BeforeEach
     public void setup() {
-        MetricsTestUtil.initMetrics();
         completableFuture = CompletableFuture.completedFuture(null);
         when(dnsAddressEndpointGroup.whenReady()).thenReturn(completableFuture);
 
-        pluginMetrics = PluginMetrics.fromNames(COMPONENT_ID, COMPONENT_SCOPE);
         dnsPeerListProvider = new DnsPeerListProvider(dnsAddressEndpointGroup, pluginMetrics);
     }
 
-    @Test(expected = NullPointerException.class)
+    @Test
     public void testDefaultListProviderWithNullHostname() {
-        new DnsPeerListProvider(null, pluginMetrics);
+        assertThrows(NullPointerException.class, () -> new DnsPeerListProvider(null, pluginMetrics));
     }
 
-    @Test(expected = RuntimeException.class)
+    @Test
     public void testConstructWithInterruptedException() throws Exception {
         CompletableFuture mockFuture = mock(CompletableFuture.class);
         when(mockFuture.get()).thenThrow(new InterruptedException());
         when(dnsAddressEndpointGroup.whenReady()).thenReturn(mockFuture);
 
-        new DnsPeerListProvider(dnsAddressEndpointGroup, pluginMetrics);
+        assertThrows(RuntimeException.class, () -> new DnsPeerListProvider(dnsAddressEndpointGroup, pluginMetrics));
     }
 
     @Test
@@ -90,17 +90,27 @@ public void testGetPeerList() {
     }
 
     @Test
-    public void testActivePeerCounter() {
+    public void testActivePeerCounter_with_list() {
         when(dnsAddressEndpointGroup.endpoints()).thenReturn(ENDPOINT_LIST);
 
-        final List endpointsMeasures = MetricsTestUtil.getMeasurementList(new StringJoiner(MetricNames.DELIMITER).add(COMPONENT_SCOPE).add(COMPONENT_ID)
-                .add(PeerListProvider.PEER_ENDPOINTS).toString());
-        assertEquals(1, endpointsMeasures.size());
-        final Measurement endpointsMeasure = endpointsMeasures.get(0);
-        assertEquals(2.0, endpointsMeasure.getValue(), 0);
+        final ArgumentCaptor<ToDoubleFunction<DnsAddressEndpointGroup>> gaugeFunctionCaptor = ArgumentCaptor.forClass(ToDoubleFunction.class);
+        verify(pluginMetrics).gauge(eq(PEER_ENDPOINTS), eq(dnsAddressEndpointGroup), gaugeFunctionCaptor.capture());
+
+        final ToDoubleFunction<DnsAddressEndpointGroup> gaugeFunction = gaugeFunctionCaptor.getValue();
+        assertThat(gaugeFunction.applyAsDouble(dnsAddressEndpointGroup), equalTo(2.0));
+    }
+
+    @Test
+    public void testActivePeerCounter_with_single() {
         when(dnsAddressEndpointGroup.endpoints()).thenReturn(Collections.singletonList(Endpoint.of(ENDPOINT_1)));
 
-        assertEquals(1.0, endpointsMeasure.getValue(), 0);
+        final ArgumentCaptor<ToDoubleFunction<DnsAddressEndpointGroup>> gaugeFunctionCaptor = ArgumentCaptor.forClass(ToDoubleFunction.class);
+        verify(pluginMetrics).gauge(eq(PEER_ENDPOINTS), eq(dnsAddressEndpointGroup), gaugeFunctionCaptor.capture());
+
+        final ToDoubleFunction<DnsAddressEndpointGroup> gaugeFunction = gaugeFunctionCaptor.getValue();
+
+        assertThat(gaugeFunction.applyAsDouble(dnsAddressEndpointGroup), equalTo(1.0));
     }
 
     @Test
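The rewritten testActivePeerCounter tests above (and their counterpart in StaticPeerListProviderTest below) replace the old MetricsTestUtil approach, which read measurements back out of a shared registry, with a mocked PluginMetrics plus an ArgumentCaptor for the ToDoubleFunction the provider registers as its gauge. The captured function can then be evaluated directly, so no real meter registry is needed. A condensed sketch of the pattern, using only names already present in this diff:

    // Capture the gauge function registered against the mocked PluginMetrics...
    final ArgumentCaptor<ToDoubleFunction<DnsAddressEndpointGroup>> captor =
            ArgumentCaptor.forClass(ToDoubleFunction.class);
    verify(pluginMetrics).gauge(eq(PEER_ENDPOINTS), eq(dnsAddressEndpointGroup), captor.capture());
    // ...then invoke it directly to assert the reported endpoint count.
    assertThat(captor.getValue().applyAsDouble(dnsAddressEndpointGroup), equalTo(2.0));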
diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/discovery/StaticPeerListProviderTest.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/discovery/StaticPeerListProviderTest.java
index 14bc836e36..589329b108 100644
--- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/discovery/StaticPeerListProviderTest.java
+++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/peerforwarder/discovery/StaticPeerListProviderTest.java
@@ -5,56 +5,58 @@
 package org.opensearch.dataprepper.peerforwarder.discovery;
 
-import io.micrometer.core.instrument.Measurement;
-import org.junit.Before;
-import org.junit.Test;
-import org.junit.runner.RunWith;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.mockito.ArgumentCaptor;
 import org.mockito.Mock;
-import org.mockito.junit.MockitoJUnitRunner;
-import org.opensearch.dataprepper.metrics.MetricNames;
-import org.opensearch.dataprepper.metrics.MetricsTestUtil;
+import org.mockito.junit.jupiter.MockitoExtension;
 import org.opensearch.dataprepper.metrics.PluginMetrics;
 import org.opensearch.dataprepper.peerforwarder.HashRing;
 
 import java.util.Arrays;
 import java.util.Collections;
 import java.util.List;
-import java.util.StringJoiner;
-
-import static org.junit.Assert.assertEquals;
+import java.util.function.ToDoubleFunction;
+
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.junit.jupiter.api.Assertions.assertEquals;
+import static org.junit.jupiter.api.Assertions.assertThrows;
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.ArgumentMatchers.eq;
+import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.verifyNoInteractions;
+import static org.opensearch.dataprepper.peerforwarder.discovery.PeerListProvider.PEER_ENDPOINTS;
 
-@RunWith(MockitoJUnitRunner.class)
+@ExtendWith(MockitoExtension.class)
 public class StaticPeerListProviderTest {
 
     private static final String ENDPOINT_1 = "10.10.0.1";
     private static final String ENDPOINT_2 = "10.10.0.2";
     private static final List<String> ENDPOINT_LIST = Arrays.asList(ENDPOINT_1, ENDPOINT_2);
-    private static final String COMPONENT_SCOPE = "testComponentScope";
-    private static final String COMPONENT_ID = "testComponentId";
 
     @Mock
     private HashRing hashRing;
 
+    @Mock
     private PluginMetrics pluginMetrics;
 
     private StaticPeerListProvider staticPeerListProvider;
 
-    @Before
+    @BeforeEach
     public void setup() {
-        MetricsTestUtil.initMetrics();
-        pluginMetrics = PluginMetrics.fromNames(COMPONENT_ID, COMPONENT_SCOPE);
         staticPeerListProvider = new StaticPeerListProvider(ENDPOINT_LIST, pluginMetrics);
     }
 
-    @Test(expected = RuntimeException.class)
+    @Test
     public void testListProviderWithEmptyList() {
-        new StaticPeerListProvider(Collections.emptyList(), pluginMetrics);
+        assertThrows(RuntimeException.class, () -> new StaticPeerListProvider(Collections.emptyList(), pluginMetrics));
     }
 
-    @Test(expected = RuntimeException.class)
+    @Test
     public void testListProviderWithNullList() {
-        new StaticPeerListProvider(null, pluginMetrics);
+        assertThrows(RuntimeException.class, () -> new StaticPeerListProvider(null, pluginMetrics));
     }
 
     @Test
@@ -65,11 +67,12 @@ public void testListProviderWithNonEmptyList() {
 
     @Test
     public void testActivePeerCounter() {
-        final List endpointsMeasures = MetricsTestUtil.getMeasurementList(
-                new StringJoiner(MetricNames.DELIMITER).add(COMPONENT_SCOPE).add(COMPONENT_ID).add(PeerListProvider.PEER_ENDPOINTS).toString());
-        assertEquals(1, endpointsMeasures.size());
-        final Measurement endpointsMeasure = endpointsMeasures.get(0);
-        assertEquals(2.0, endpointsMeasure.getValue(), 0);
+        final ArgumentCaptor<ToDoubleFunction<List<String>>> gaugeFunctionCaptor = ArgumentCaptor.forClass(ToDoubleFunction.class);
+        verify(pluginMetrics).gauge(eq(PEER_ENDPOINTS), any(List.class), gaugeFunctionCaptor.capture());
+
+        final ToDoubleFunction<List<String>> gaugeFunction = gaugeFunctionCaptor.getValue();
+
+        assertThat(gaugeFunction.applyAsDouble(ENDPOINT_LIST), equalTo(2.0));
     }
 
     @Test
diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/PipelineConnectorTest.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/PipelineConnectorTest.java
index fb54d532b7..e2af218c25 100644
--- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/PipelineConnectorTest.java
+++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/PipelineConnectorTest.java
@@ -23,7 +23,7 @@
 import org.junit.Test;
 import org.junit.runner.RunWith;
 import org.mockito.Mock;
-import org.mockito.runners.MockitoJUnitRunner;
+import org.mockito.junit.MockitoJUnitRunner;
 import org.opensearch.dataprepper.plugins.buffer.blockingbuffer.BlockingBuffer;
 
diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/PipelineTests.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/PipelineTests.java
index 5c0a9a974e..c2e0ad769f 100644
--- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/PipelineTests.java
+++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/PipelineTests.java
@@ -441,7 +441,7 @@ void publishToSinks_calls_route_with_Events_and_Sinks_verify_AcknowledgementSetM
         Pipeline pipeline = createObjectUnderTest();
         when(mockSource.areAcknowledgementsEnabled()).thenReturn(true);
         pipeline.publishToSinks(records);
-        verify(acknowledgementSetManager).acquireEventReference(any(DefaultEventHandle.class));
+        verify(eventHandle).acquireReference();
         verify(router)
                 .route(anyCollection(), eq(dataFlowComponents), any(RouterGetRecordStrategy.class), any(BiConsumer.class));
diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/ProcessWorkerTest.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/ProcessWorkerTest.java
index 3d13c0d49f..455da07a93 100644
--- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/ProcessWorkerTest.java
+++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/ProcessWorkerTest.java
@@ -12,6 +12,7 @@
 import org.opensearch.dataprepper.model.event.DefaultEventHandle;
 import org.opensearch.dataprepper.model.event.Event;
 import org.opensearch.dataprepper.model.event.EventHandle;
+import org.opensearch.dataprepper.model.event.InternalEventHandle;
 import org.opensearch.dataprepper.model.processor.Processor;
 import org.opensearch.dataprepper.model.record.Record;
 import org.opensearch.dataprepper.model.source.Source;
@@ -27,7 +28,6 @@
 import java.util.concurrent.Future;
 
 import static org.mockito.ArgumentMatchers.any;
-import static org.mockito.Mockito.doNothing;
 import static org.mockito.Mockito.mock;
 import static org.mockito.Mockito.mockStatic;
 import static org.mockito.Mockito.never;
@@ -104,7 +104,6 @@ void testProcessWorkerHappyPathWithAcknowledgments() {
         final Record mockRecord = mock(Record.class);
         final Event mockEvent = mock(Event.class);
         final EventHandle eventHandle = mock(DefaultEventHandle.class);
-        when(((DefaultEventHandle) eventHandle).getAcknowledgementSet()).thenReturn(mock(AcknowledgementSet.class));
         when(mockRecord.getData()).thenReturn(mockEvent);
         when(mockEvent.getEventHandle()).thenReturn(eventHandle);
 
@@ -174,8 +173,8 @@ void testProcessWorkerWithProcessorThrowingExceptionAndAcknowledgmentsEnabledIsH
         final Record mockRecord = mock(Record.class);
         final Event mockEvent = mock(Event.class);
         final EventHandle eventHandle = mock(DefaultEventHandle.class);
-        when(((DefaultEventHandle) eventHandle).getAcknowledgementSet()).thenReturn(mock(AcknowledgementSet.class));
-        doNothing().when(eventHandle).release(true);
+        final AcknowledgementSet acknowledgementSet = mock(AcknowledgementSet.class);
+        ((InternalEventHandle)eventHandle).addAcknowledgementSet(acknowledgementSet);
         when(mockRecord.getData()).thenReturn(mockEvent);
         when(mockEvent.getEventHandle()).thenReturn(eventHandle);
 
@@ -218,8 +217,8 @@ void testProcessWorkerWithProcessorDroppingAllRecordsAndAcknowledgmentsEnabledIs
         final Record mockRecord = mock(Record.class);
         final Event mockEvent = mock(Event.class);
         final EventHandle eventHandle = mock(DefaultEventHandle.class);
-        when(((DefaultEventHandle) eventHandle).getAcknowledgementSet()).thenReturn(mock(AcknowledgementSet.class));
-        doNothing().when(eventHandle).release(true);
+        final AcknowledgementSet acknowledgementSet = mock(AcknowledgementSet.class);
+        ((InternalEventHandle)eventHandle).addAcknowledgementSet(acknowledgementSet);
         when(mockRecord.getData()).thenReturn(mockEvent);
         when(mockEvent.getEventHandle()).thenReturn(eventHandle);
 
diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/common/FutureHelperTest.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/common/FutureHelperTest.java
index c572766ac2..ba8a9714de 100644
--- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/common/FutureHelperTest.java
+++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/common/FutureHelperTest.java
@@ -9,7 +9,7 @@
 import org.junit.Test;
 import org.junit.runner.RunWith;
 import org.mockito.Mock;
-import org.mockito.runners.MockitoJUnitRunner;
+import org.mockito.junit.MockitoJUnitRunner;
 
 import java.util.Arrays;
 import java.util.concurrent.ExecutionException;
diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/router/DataFlowComponentRouterTest.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/router/DataFlowComponentRouterTest.java
index 3802356592..1ea74afe70 100644
--- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/router/DataFlowComponentRouterTest.java
+++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/router/DataFlowComponentRouterTest.java
@@ -158,6 +158,17 @@ void route_no_Events_when_none_have_matching_routes() {
         verify(componentRecordsConsumer).accept(testComponent, Collections.emptyList());
     }
 
+    @Test
+    void route_no_Events_when_none_have_matching_routes_with_default_route() {
+        when(dataFlowComponent.getRoutes()).thenReturn(Set.of(DataFlowComponentRouter.DEFAULT_ROUTE));
+        final Map<Record, Set<String>> noMatchingRoutes = recordsIn.stream()
+                .collect(Collectors.toMap(Function.identity(), r -> Collections.emptySet()));
+
+        createObjectUnderTest().route(recordsIn, dataFlowComponent, noMatchingRoutes, getRecordStrategy, componentRecordsConsumer);
+
+        verify(componentRecordsConsumer).accept(testComponent, recordsIn);
+    }
+
     @Test
     void route_all_Events_when_all_have_matched_route() {
 
@@ -236,6 +247,33 @@ void route_no_Events_when_none_have_matching_routes() {
         verify(componentRecordsConsumer).accept(testComponent, Collections.emptyList());
     }
 
+    @Test
+    void route_no_Events_when_none_have_matching_routes_with_default_route() {
+        when(dataFlowComponent.getRoutes()).thenReturn(Set.of(DataFlowComponentRouter.DEFAULT_ROUTE));
+        final Map<Record, Set<String>> noMatchingRoutes = recordsIn.stream()
+                .collect(Collectors.toMap(Function.identity(), r -> Collections.emptySet()));
+
+        createObjectUnderTest().route(recordsIn, dataFlowComponent, noMatchingRoutes, getRecordStrategy, componentRecordsConsumer);
+
+        verify(componentRecordsConsumer).accept(testComponent, recordsIn);
+    }
+
+    @Test
+    void route_matched_events_with_none_to_default_route() {
+        DataFlowComponent dataFlowComponent2 = mock(DataFlowComponent.class);
+        when(dataFlowComponent2.getRoutes()).thenReturn(Set.of(DataFlowComponentRouter.DEFAULT_ROUTE));
+        final Map<Record, Set<String>> allMatchingRoutes = recordsIn.stream()
+                .collect(Collectors.toMap(Function.identity(), r -> Collections.singleton(knownRoute)));
+
+        createObjectUnderTest().route(recordsIn, dataFlowComponent2, allMatchingRoutes, getRecordStrategy, componentRecordsConsumer);
+        verify(componentRecordsConsumer).accept(null, Collections.emptyList());
+        createObjectUnderTest().route(recordsIn, dataFlowComponent, allMatchingRoutes, getRecordStrategy, componentRecordsConsumer);
+
+        verify(componentRecordsConsumer).accept(testComponent, recordsIn);
+
+    }
+
+
     @Test
     void route_all_Events_when_all_have_matched_route() {
 
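The new router tests above pin down the DataFlowComponentRouter.DEFAULT_ROUTE semantics: a component whose only route is the default route receives exactly those records that matched no route at all, and receives nothing when every record matched some other route. A hedged sketch of the decision rule the tests imply; shouldDeliver is an invented name, not the actual router implementation:

    // Illustrative only: the fallback rule exercised by the tests above.
    static boolean shouldDeliver(final Set<String> componentRoutes, final Set<String> matchedRoutes) {
        final boolean directMatch = componentRoutes.stream().anyMatch(matchedRoutes::contains);
        final boolean defaultMatch = componentRoutes.contains(DataFlowComponentRouter.DEFAULT_ROUTE)
                && matchedRoutes.isEmpty();
        return directMatch || defaultMatch;
    }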
diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/router/RouterCopyRecordStrategyTests.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/router/RouterCopyRecordStrategyTests.java
index 4c56113323..c971cd5b8d 100644
--- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/router/RouterCopyRecordStrategyTests.java
+++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/router/RouterCopyRecordStrategyTests.java
@@ -83,7 +83,7 @@ void setUp() {
                     int v = handleRefCount.getOrDefault(handle, 0);
                     handleRefCount.put(handle, v+1);
                     return null;
-            }).when(acknowledgementSetManager).acquireEventReference(any(DefaultEventHandle.class));
+            }).when(acknowledgementSet1).acquire(any(DefaultEventHandle.class));
         } catch (Exception e){}
         mockRecordsIn = IntStream.range(0, 10)
                 .mapToObj(i -> mock(Record.class))
@@ -103,7 +103,7 @@ private void attachEventHandlesToRecordsIn(List eventHandles
         while (iter.hasNext()) {
             Record r = (Record) iter.next();
             DefaultEventHandle handle = (DefaultEventHandle)((JacksonEvent)r.getData()).getEventHandle();
-            handle.setAcknowledgementSet(acknowledgementSet1);
+            handle.addAcknowledgementSet(acknowledgementSet1);
            eventHandles.add(handle);
         }
     }
@@ -195,6 +195,7 @@ void test_one_record_with_acknowledgements() {
         assertTrue(getRecordStrategy.getReferencedRecords().contains(firstRecord));
         recordOut = getRecordStrategy.getRecord(firstRecord);
         assertThat(recordOut, sameInstance(firstRecord));
+        firstHandle.addAcknowledgementSet(acknowledgementSet1);
         assertThat(handleRefCount.get(firstHandle), equalTo(1));
         recordOut = getRecordStrategy.getRecord(firstRecord);
         assertThat(recordOut, sameInstance(firstRecord));
@@ -242,7 +243,7 @@ void test_one_record_with_acknowledgements_and_multi_components() {
         try {
             doAnswer((i) -> {
                 JacksonEvent e1 = (JacksonEvent) i.getArgument(0);
-                ((DefaultEventHandle)e1.getEventHandle()).setAcknowledgementSet(acknowledgementSet1);
+                ((DefaultEventHandle)e1.getEventHandle()).addAcknowledgementSet(acknowledgementSet1);
                 return null;
             }).when(acknowledgementSet1).add(any(JacksonEvent.class));
         } catch (Exception e){}
@@ -280,7 +281,7 @@ void test_multiple_records_with_acknowledgements_and_multi_components() {
         try {
             doAnswer((i) -> {
                 JacksonEvent e1 = (JacksonEvent) i.getArgument(0);
-                ((DefaultEventHandle)e1.getEventHandle()).setAcknowledgementSet(acknowledgementSet1);
+                ((DefaultEventHandle)e1.getEventHandle()).addAcknowledgementSet(acknowledgementSet1);
                 return null;
             }).when(acknowledgementSet1).add(any(JacksonEvent.class));
         } catch (Exception e){}
diff --git a/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/server/CloudWatchMeterRegistryProviderTest.java b/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/server/CloudWatchMeterRegistryProviderTest.java
index 53db40d1a6..9dc744981b 100644
--- a/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/server/CloudWatchMeterRegistryProviderTest.java
+++ b/data-prepper-core/src/test/java/org/opensearch/dataprepper/pipeline/server/CloudWatchMeterRegistryProviderTest.java
@@ -9,7 +9,7 @@
 import org.junit.Test;
 import org.junit.runner.RunWith;
 import org.mockito.Mock;
-import org.mockito.runners.MockitoJUnitRunner;
+import org.mockito.junit.MockitoJUnitRunner;
 import software.amazon.awssdk.services.cloudwatch.CloudWatchAsyncClient;
 
 import static org.hamcrest.CoreMatchers.notNullValue;
diff --git a/data-prepper-core/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-core/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker
deleted file mode 100644
index 23c33feb6d..0000000000
--- a/data-prepper-core/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker
+++ /dev/null
@@ -1,3 +0,0 @@
-# To enable mocking of final classes with vanilla Mockito
-# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods
-mock-maker-inline
diff --git a/data-prepper-event/src/main/java/org/opensearch/dataprepper/core/event/DefaultEventKeyFactory.java b/data-prepper-event/src/main/java/org/opensearch/dataprepper/core/event/DefaultEventKeyFactory.java
new file mode 100644
index 0000000000..605b5bcb41
--- /dev/null
+++ b/data-prepper-event/src/main/java/org/opensearch/dataprepper/core/event/DefaultEventKeyFactory.java
@@ -0,0 +1,20 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.core.event;
+
+import org.opensearch.dataprepper.model.event.EventKey;
+import org.opensearch.dataprepper.model.event.EventKeyFactory;
+import org.opensearch.dataprepper.model.event.InternalOnlyEventKeyBridge;
+
+import javax.inject.Named;
+
+@Named
+public class DefaultEventKeyFactory implements EventKeyFactory {
+    @Override
+    public EventKey createEventKey(final String key, final EventAction... forActions) {
+        return InternalOnlyEventKeyBridge.createEventKey(key, forActions);
+    }
+}
diff --git a/data-prepper-event/src/main/java/org/opensearch/dataprepper/model/event/InternalOnlyEventKeyBridge.java b/data-prepper-event/src/main/java/org/opensearch/dataprepper/model/event/InternalOnlyEventKeyBridge.java
new file mode 100644
index 0000000000..130b94db0e
--- /dev/null
+++ b/data-prepper-event/src/main/java/org/opensearch/dataprepper/model/event/InternalOnlyEventKeyBridge.java
@@ -0,0 +1,17 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.model.event;
+
+/**
+ * Until we remove {@link JacksonEvent} from data-prepper-api,
+ * we will need this class to give us access to the package-protected
+ * {@link JacksonEventKey}.
+ */
+public class InternalOnlyEventKeyBridge {
+    public static EventKey createEventKey(final String key, final EventKeyFactory.EventAction... forAction) {
+        return new JacksonEventKey(key, forAction);
+    }
+}
diff --git a/data-prepper-event/src/test/java/org/opensearch/dataprepper/core/event/DefaultEventKeyFactoryTest.java b/data-prepper-event/src/test/java/org/opensearch/dataprepper/core/event/DefaultEventKeyFactoryTest.java
new file mode 100644
index 0000000000..8d034fcc83
--- /dev/null
+++ b/data-prepper-event/src/test/java/org/opensearch/dataprepper/core/event/DefaultEventKeyFactoryTest.java
@@ -0,0 +1,52 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.core.event;
+
+import org.junit.jupiter.api.Test;
+import org.opensearch.dataprepper.model.event.EventKey;
+import org.opensearch.dataprepper.model.event.EventKeyFactory;
+
+import java.util.UUID;
+
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.CoreMatchers.notNullValue;
+import static org.hamcrest.MatcherAssert.assertThat;
+
+class DefaultEventKeyFactoryTest {
+
+    private DefaultEventKeyFactory createObjectUnderTest() {
+        return new DefaultEventKeyFactory();
+    }
+
+    @Test
+    void createEventKey_returns_correct_EventKey() {
+        final String keyPath = UUID.randomUUID().toString();
+        final EventKey eventKey = createObjectUnderTest().createEventKey(keyPath);
+
+        assertThat(eventKey, notNullValue());
+        assertThat(eventKey.getKey(), equalTo(keyPath));
+    }
+
+    @Test
+    void createEventKey_with_EventAction_returns_correct_EventKey() {
+        final String keyPath = UUID.randomUUID().toString();
+        final EventKey eventKey = createObjectUnderTest().createEventKey(keyPath, EventKeyFactory.EventAction.GET);
+
+        assertThat(eventKey, notNullValue());
+        assertThat(eventKey.getKey(), equalTo(keyPath));
+    }
+
+    @Test
+    void createEventKey_returns_JacksonEventKey() {
+        final String keyPath = UUID.randomUUID().toString();
+        final EventKey eventKey = createObjectUnderTest().createEventKey(keyPath);
+
+        assertThat(eventKey, notNullValue());
+        assertThat(eventKey.getClass().getSimpleName(), equalTo("JacksonEventKey"));
+
+        assertThat(eventKey.getKey(), equalTo(keyPath));
+    }
+}
\ No newline at end of file
diff --git a/data-prepper-expression/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-expression/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker
deleted file mode 100644
index 1f0955d450..0000000000
--- a/data-prepper-expression/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker
+++ /dev/null
@@ -1 +0,0 @@
-mock-maker-inline
diff --git a/data-prepper-logstash-configuration/build.gradle b/data-prepper-logstash-configuration/build.gradle
index 6e328b7adc..002ae15516 100644
--- a/data-prepper-logstash-configuration/build.gradle
+++ b/data-prepper-logstash-configuration/build.gradle
@@ -25,7 +25,6 @@ dependencies {
     implementation 'com.fasterxml.jackson.core:jackson-databind'
     implementation libs.commons.lang3
     testImplementation testLibs.slf4j.simple
-    testImplementation testLibs.mockito.inline
 }
 
 generateGrammarSource {
diff --git a/data-prepper-pipeline-parser/build.gradle b/data-prepper-pipeline-parser/build.gradle
index 09c89eb15c..a94f63fc1d 100644
--- a/data-prepper-pipeline-parser/build.gradle
+++ b/data-prepper-pipeline-parser/build.gradle
@@ -18,6 +18,7 @@ dependencies {
     implementation 'org.projectlombok:lombok:1.18.22'
     implementation 'com.jayway.jsonpath:json-path:2.6.0'
     implementation 'javax.inject:javax.inject:1'
+    implementation 'javax.annotation:javax.annotation-api:1.3.2'
     implementation(libs.spring.core) {
         exclude group: 'commons-logging', module: 'commons-logging'
     }
@@ -29,12 +30,7 @@ dependencies {
     testImplementation testLibs.bundles.junit
     testImplementation testLibs.bundles.mockito
     testImplementation testLibs.hamcrest
-    testImplementation 'org.powermock:powermock-module-junit4:2.0.9'
-    testImplementation 'org.powermock:powermock-api-mockito2:2.0.9'
     testImplementation 'org.assertj:assertj-core:3.20.2'
-    testImplementation 'junit:junit:4.13.2'
-    testImplementation 'org.powermock:powermock-module-junit4:2.0.9'
-    testImplementation 'org.powermock:powermock-api-mockito2:2.0.9'
     compileOnly 'org.projectlombok:lombok:1.18.20'
     annotationProcessor 'org.projectlombok:lombok:1.18.20'
 }
\ No newline at end of file
diff --git a/data-prepper-pipeline-parser/src/main/java/org/opensearch/dataprepper/core/validators/NotEmptyValidatorForEventKey.java b/data-prepper-pipeline-parser/src/main/java/org/opensearch/dataprepper/core/validators/NotEmptyValidatorForEventKey.java
new file mode 100644
index 0000000000..507d2e9637
--- /dev/null
+++ b/data-prepper-pipeline-parser/src/main/java/org/opensearch/dataprepper/core/validators/NotEmptyValidatorForEventKey.java
@@ -0,0 +1,22 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.core.validators;
+
+import jakarta.validation.ConstraintValidator;
+import jakarta.validation.ConstraintValidatorContext;
+import org.opensearch.dataprepper.model.event.EventKey;
+
+import jakarta.validation.constraints.NotEmpty;
+
+public class NotEmptyValidatorForEventKey implements ConstraintValidator<NotEmpty, EventKey> {
+    @Override
+    public boolean isValid(final EventKey eventKey, final ConstraintValidatorContext constraintValidatorContext) {
+        if(eventKey == null) {
+            return false;
+        }
+        return !eventKey.getKey().isEmpty();
+    }
+}
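NotEmptyValidatorForEventKey above extends the standard jakarta.validation @NotEmpty constraint to EventKey values, and the META-INF/services registration later in this diff is what makes the validator discoverable. A hedged sketch of the intended use; MyProcessorConfig and its property are invented names for illustration:

    // Hypothetical plugin configuration; only @NotEmpty support for EventKey is established by this diff.
    public class MyProcessorConfig {
        @NotEmpty
        @JsonProperty("key")
        private EventKey key; // fails validation when the configured key is missing or empty
    }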
diff --git a/data-prepper-pipeline-parser/src/main/java/org/opensearch/dataprepper/pipeline/parser/EventKeyDeserializer.java b/data-prepper-pipeline-parser/src/main/java/org/opensearch/dataprepper/pipeline/parser/EventKeyDeserializer.java
new file mode 100644
index 0000000000..fbc27edc8b
--- /dev/null
+++ b/data-prepper-pipeline-parser/src/main/java/org/opensearch/dataprepper/pipeline/parser/EventKeyDeserializer.java
@@ -0,0 +1,60 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.pipeline.parser;
+
+import com.fasterxml.jackson.core.JsonParser;
+import com.fasterxml.jackson.databind.BeanProperty;
+import com.fasterxml.jackson.databind.DeserializationContext;
+import com.fasterxml.jackson.databind.JsonDeserializer;
+import com.fasterxml.jackson.databind.deser.ContextualDeserializer;
+import com.fasterxml.jackson.databind.deser.std.StdDeserializer;
+import org.opensearch.dataprepper.model.event.EventKey;
+import org.opensearch.dataprepper.model.event.EventKeyConfiguration;
+import org.opensearch.dataprepper.model.event.EventKeyFactory;
+
+import java.io.IOException;
+
+public class EventKeyDeserializer extends StdDeserializer<EventKey> implements ContextualDeserializer {
+    private final EventKeyFactory eventKeyFactory;
+    private final EventKeyFactory.EventAction[] eventAction;
+
+    /**
+     * Constructs a new {@link EventKeyDeserializer} from an {@link EventKeyFactory}.
+     *
+     * @param eventKeyFactory The factory for creating {@link EventKey} objects.
+     */
+    public EventKeyDeserializer(final EventKeyFactory eventKeyFactory) {
+        this(eventKeyFactory, new EventKeyFactory.EventAction[] {EventKeyFactory.EventAction.ALL});
+    }
+
+    private EventKeyDeserializer(final EventKeyFactory eventKeyFactory, final EventKeyFactory.EventAction[] eventAction) {
+        super(EventKey.class);
+        this.eventKeyFactory = eventKeyFactory;
+        this.eventAction = eventAction;
+    }
+
+    @Override
+    public EventKey deserialize(final JsonParser parser, final DeserializationContext ctxt) throws IOException {
+        final String eventKeyString = parser.getValueAsString();
+
+        return eventKeyFactory.createEventKey(eventKeyString, eventAction);
+    }
+
+    @Override
+    public JsonDeserializer<?> createContextual(final DeserializationContext deserializationContext, final BeanProperty property) {
+        if(property == null)
+            return this;
+
+        final EventKeyConfiguration eventKeyConfiguration = property.getAnnotation(EventKeyConfiguration.class);
+
+        if(eventKeyConfiguration == null)
+            return this;
+
+        final EventKeyFactory.EventAction[] eventAction = eventKeyConfiguration.value();
+
+        return new EventKeyDeserializer(eventKeyFactory, eventAction);
+    }
+}
diff --git a/data-prepper-pipeline-parser/src/main/resources/META-INF/services/jakarta.validation.ConstraintValidator b/data-prepper-pipeline-parser/src/main/resources/META-INF/services/jakarta.validation.ConstraintValidator
new file mode 100644
index 0000000000..ab6fb40c08
--- /dev/null
+++ b/data-prepper-pipeline-parser/src/main/resources/META-INF/services/jakarta.validation.ConstraintValidator
@@ -0,0 +1,6 @@
+#
+# Copyright OpenSearch Contributors
+# SPDX-License-Identifier: Apache-2.0
+#
+
+org.opensearch.dataprepper.core.validators.NotEmptyValidatorForEventKey
\ No newline at end of file
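EventKeyDeserializer above is contextual: with no annotation it creates keys for EventAction.ALL, while a property annotated with @EventKeyConfiguration narrows the actions the key is created for. A hedged usage sketch; the field and property names are invented, while EventKeyConfiguration and the EventAction values come from this diff:

    // Hypothetical configuration field: the contextual deserializer built for it will call
    // eventKeyFactory.createEventKey(value, EventAction.PUT, EventAction.DELETE).
    @EventKeyConfiguration({EventKeyFactory.EventAction.PUT, EventKeyFactory.EventAction.DELETE})
    @JsonProperty("target_key")
    private EventKey targetKey;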
diff --git a/data-prepper-pipeline-parser/src/test/java/org/opensearch/dataprepper/core/validators/NotEmptyValidatorForEventKeyTest.java b/data-prepper-pipeline-parser/src/test/java/org/opensearch/dataprepper/core/validators/NotEmptyValidatorForEventKeyTest.java
new file mode 100644
index 0000000000..d49ca2c161
--- /dev/null
+++ b/data-prepper-pipeline-parser/src/test/java/org/opensearch/dataprepper/core/validators/NotEmptyValidatorForEventKeyTest.java
@@ -0,0 +1,50 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.core.validators;
+
+import jakarta.validation.ConstraintValidatorContext;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.ValueSource;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import org.opensearch.dataprepper.model.event.EventKey;
+
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.mockito.Mockito.when;
+
+@ExtendWith(MockitoExtension.class)
+class NotEmptyValidatorForEventKeyTest {
+    @Mock
+    private EventKey eventKey;
+
+    @Mock
+    private ConstraintValidatorContext context;
+
+    private NotEmptyValidatorForEventKey createObjectUnderTest() {
+        return new NotEmptyValidatorForEventKey();
+    }
+
+    @Test
+    void isValid_returns_false_if_EventKey_is_empty() {
+        assertThat(createObjectUnderTest().isValid(null, context), equalTo(false));
+    }
+
+    @Test
+    void isValid_returns_false_if_EventKey_getKey_is_empty() {
+        when(eventKey.getKey()).thenReturn("");
+        assertThat(createObjectUnderTest().isValid(eventKey, context), equalTo(false));
+    }
+
+    @ParameterizedTest
+    @ValueSource(strings = {"/", "a", "/abcdefghijklmnopqrstuvwxyz"})
+    void isValid_returns_true_if_EventKey_getKey_is_not_empty(final String key) {
+        when(eventKey.getKey()).thenReturn(key);
+        assertThat(createObjectUnderTest().isValid(eventKey, context), equalTo(true));
+    }
+}
\ No newline at end of file
diff --git a/data-prepper-pipeline-parser/src/test/java/org/opensearch/dataprepper/pipeline/parser/EventKeyDeserializerTest.java b/data-prepper-pipeline-parser/src/test/java/org/opensearch/dataprepper/pipeline/parser/EventKeyDeserializerTest.java
new file mode 100644
index 0000000000..240c14dd37
--- /dev/null
+++ b/data-prepper-pipeline-parser/src/test/java/org/opensearch/dataprepper/pipeline/parser/EventKeyDeserializerTest.java
@@ -0,0 +1,142 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.pipeline.parser;
+
+import com.fasterxml.jackson.core.JsonParser;
+import com.fasterxml.jackson.databind.BeanProperty;
+import com.fasterxml.jackson.databind.DeserializationContext;
+import com.fasterxml.jackson.databind.JsonDeserializer;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.module.SimpleModule;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Nested;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.EnumSource;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import org.opensearch.dataprepper.model.event.EventKey;
+import org.opensearch.dataprepper.model.event.EventKeyConfiguration;
+import org.opensearch.dataprepper.model.event.EventKeyFactory;
+
+import java.io.IOException;
+import java.util.UUID;
+
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.CoreMatchers.notNullValue;
+import static org.hamcrest.CoreMatchers.sameInstance;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.when;
+
+@ExtendWith(MockitoExtension.class)
+class EventKeyDeserializerTest {
+
+    @Mock
+    private EventKeyFactory eventKeyFactory;
+
+    @Mock
+    private DeserializationContext deserializationContext;
+    @Mock
+    private BeanProperty property;
+    @Mock(lenient = true)
+    private JsonParser parser;
+    @Mock
+    private EventKey eventKey;
+
+    private String eventKeyString;
+
+    @BeforeEach
+    void setUp() throws IOException {
+        eventKeyString = UUID.randomUUID().toString();
+
+        when(parser.getValueAsString()).thenReturn(eventKeyString);
+    }
+
+    private EventKeyDeserializer createObjectUnderTest() {
+        return new EventKeyDeserializer(eventKeyFactory);
+    }
+
+    @Test
+    void createContextual_returns_EventKeyDeserializer_that_deserializes_with_ALL_when_no_BeanProperty() throws IOException {
+        when(eventKeyFactory.createEventKey(eventKeyString, EventKeyFactory.EventAction.ALL)).thenReturn(eventKey);
+        final JsonDeserializer<?> contextualDeserializer = createObjectUnderTest().createContextual(deserializationContext, null);
+        assertThat(contextualDeserializer, notNullValue());
+        assertThat(contextualDeserializer.deserialize(parser, deserializationContext), equalTo(eventKey));
+    }
+
+    @Test
+    void createContextual_returns_EventKeyDeserializer_that_deserializes_with_ALL_when_no_annotation() throws IOException {
+        when(eventKeyFactory.createEventKey(eventKeyString, EventKeyFactory.EventAction.ALL)).thenReturn(eventKey);
+        final JsonDeserializer<?> contextualDeserializer = createObjectUnderTest().createContextual(deserializationContext, property);
+        assertThat(contextualDeserializer, notNullValue());
+        assertThat(contextualDeserializer.deserialize(parser, deserializationContext), equalTo(eventKey));
+    }
+
+    @Test
+    void createContextual_returns_same_EventKeyDeserializer_as_self_when_no_BeanProperty() {
+        final EventKeyDeserializer objectUnderTest = createObjectUnderTest();
+        final JsonDeserializer<?> contextualDeserializer = objectUnderTest.createContextual(deserializationContext, null);
+        assertThat(contextualDeserializer, sameInstance(objectUnderTest));
+    }
+
+    @Test
+    void createContextual_returns_same_EventKeyDeserializer_as_self_when_no_annotation() {
+        final EventKeyDeserializer objectUnderTest = createObjectUnderTest();
+        final JsonDeserializer<?> contextualDeserializer = objectUnderTest.createContextual(deserializationContext, property);
+        assertThat(contextualDeserializer, sameInstance(objectUnderTest));
+    }
+
+    @ParameterizedTest
+    @EnumSource(value = EventKeyFactory.EventAction.class)
+    void createContextual_returns_EventKeyDeserializer_that_deserializes_with_action_from_annotated_Event(final EventKeyFactory.EventAction eventAction) throws IOException {
+        final EventKeyConfiguration eventKeyConfiguration = mock(EventKeyConfiguration.class);
+        when(eventKeyConfiguration.value()).thenReturn(new EventKeyFactory.EventAction[] { eventAction });
+        when(property.getAnnotation(EventKeyConfiguration.class)).thenReturn(eventKeyConfiguration);
+        when(eventKeyFactory.createEventKey(eventKeyString, eventAction)).thenReturn(eventKey);
+
+        final JsonDeserializer<?> contextualDeserializer = createObjectUnderTest().createContextual(deserializationContext, property);
+
+        assertThat(contextualDeserializer, notNullValue());
+        assertThat(contextualDeserializer.deserialize(parser, deserializationContext), equalTo(eventKey));
+    }
+
+    @Test
+    void createContextual_returns_EventKeyDeserializer_that_deserializes_with_action_from_annotated_Event_when_multiple() throws IOException {
+        final EventKeyConfiguration eventKeyConfiguration = mock(EventKeyConfiguration.class);
+        when(eventKeyConfiguration.value()).thenReturn(new EventKeyFactory.EventAction[] { EventKeyFactory.EventAction.PUT, EventKeyFactory.EventAction.DELETE });
+        when(property.getAnnotation(EventKeyConfiguration.class)).thenReturn(eventKeyConfiguration);
+        when(eventKeyFactory.createEventKey(eventKeyString, EventKeyFactory.EventAction.PUT, EventKeyFactory.EventAction.DELETE)).thenReturn(eventKey);
+
+        final JsonDeserializer<?> contextualDeserializer = createObjectUnderTest().createContextual(deserializationContext, property);
+
+        assertThat(contextualDeserializer, notNullValue());
+        assertThat(contextualDeserializer.deserialize(parser, deserializationContext), equalTo(eventKey));
+    }
+
+    @Nested
+    class UsingRealObjectMapper {
+        private ObjectMapper objectMapper;
+
+        @BeforeEach
+        void setUp() {
+            objectMapper = new ObjectMapper();
+
+            final SimpleModule simpleModule = new SimpleModule();
+            simpleModule.addDeserializer(EventKey.class, createObjectUnderTest());
+            objectMapper.registerModule(simpleModule);
+        }
+
+        @Test
+        void quick() {
+            when(eventKeyFactory.createEventKey(eventKeyString, EventKeyFactory.EventAction.ALL)).thenReturn(eventKey);
+
+            assertThat(objectMapper.convertValue(eventKeyString, EventKey.class),
+                    equalTo(eventKey));
+        }
+    }
+}
\ No newline at end of file
diff --git a/data-prepper-plugin-framework/build.gradle b/data-prepper-plugin-framework/build.gradle
index f77212a6b2..14f03fe15d 100644
--- a/data-prepper-plugin-framework/build.gradle
+++ b/data-prepper-plugin-framework/build.gradle
@@ -24,5 +24,4 @@ dependencies {
     }
     implementation libs.reflections.core
     implementation 'com.fasterxml.jackson.core:jackson-databind'
-    testImplementation testLibs.mockito.inline
 }
\ No newline at end of file
diff --git a/data-prepper-plugin-framework/src/main/java/org/opensearch/dataprepper/plugin/ApplicationContextToTypedSuppliers.java b/data-prepper-plugin-framework/src/main/java/org/opensearch/dataprepper/plugin/ApplicationContextToTypedSuppliers.java
index f5ceebbde6..f9e1abddb7 100644
--- a/data-prepper-plugin-framework/src/main/java/org/opensearch/dataprepper/plugin/ApplicationContextToTypedSuppliers.java
+++ b/data-prepper-plugin-framework/src/main/java/org/opensearch/dataprepper/plugin/ApplicationContextToTypedSuppliers.java
@@ -8,6 +8,7 @@
 import org.opensearch.dataprepper.model.acknowledgements.AcknowledgementSetManager;
 import org.opensearch.dataprepper.model.breaker.CircuitBreaker;
 import org.opensearch.dataprepper.model.event.EventFactory;
+import org.opensearch.dataprepper.model.event.EventKeyFactory;
 import org.springframework.beans.factory.annotation.Autowired;
 
 import javax.inject.Inject;
@@ -31,6 +32,7 @@ class ApplicationContextToTypedSuppliers {
     @Inject
     ApplicationContextToTypedSuppliers(
             final EventFactory eventFactory,
+            final EventKeyFactory eventKeyFactory,
             final AcknowledgementSetManager acknowledgementSetManager,
             @Autowired(required = false) final CircuitBreaker circuitBreaker
     ) {
@@ -39,6 +41,7 @@ class ApplicationContextToTypedSuppliers {
 
         typedSuppliers = Map.of(
                 EventFactory.class, () -> eventFactory,
+                EventKeyFactory.class, () -> eventKeyFactory,
                 AcknowledgementSetManager.class, () -> acknowledgementSetManager,
                 CircuitBreaker.class, () -> circuitBreaker
         );
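ApplicationContextToTypedSuppliers above is the lookup table the plugin framework consults when satisfying plugin constructor parameters, so registering EventKeyFactory here is what makes it injectable into plugins. A hedged sketch of what a plugin can now declare; MyProcessor and its config are invented, while @DataPrepperPluginConstructor appears elsewhere in this diff:

    // Hypothetical plugin constructor: the framework supplies eventKeyFactory from the map above.
    @DataPrepperPluginConstructor
    public MyProcessor(final MyProcessorConfig config, final EventKeyFactory eventKeyFactory) {
        this.sourceKey = eventKeyFactory.createEventKey(config.getSource(), EventKeyFactory.EventAction.GET);
    }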
diff --git a/data-prepper-plugin-framework/src/main/java/org/opensearch/dataprepper/plugin/ObjectMapperConfiguration.java b/data-prepper-plugin-framework/src/main/java/org/opensearch/dataprepper/plugin/ObjectMapperConfiguration.java
index 5865d5b29a..ca2cea4ee8 100644
--- a/data-prepper-plugin-framework/src/main/java/org/opensearch/dataprepper/plugin/ObjectMapperConfiguration.java
+++ b/data-prepper-plugin-framework/src/main/java/org/opensearch/dataprepper/plugin/ObjectMapperConfiguration.java
@@ -8,9 +8,12 @@
 import com.fasterxml.jackson.databind.ObjectMapper;
 import com.fasterxml.jackson.databind.PropertyNamingStrategies;
 import com.fasterxml.jackson.databind.module.SimpleModule;
+import org.opensearch.dataprepper.model.event.EventKey;
+import org.opensearch.dataprepper.model.event.EventKeyFactory;
 import org.opensearch.dataprepper.model.types.ByteCount;
 import org.opensearch.dataprepper.pipeline.parser.ByteCountDeserializer;
 import org.opensearch.dataprepper.pipeline.parser.DataPrepperDurationDeserializer;
+import org.opensearch.dataprepper.pipeline.parser.EventKeyDeserializer;
 import org.springframework.context.annotation.Bean;
 
 import javax.inject.Named;
@@ -38,10 +41,13 @@ ObjectMapper extensionPluginConfigObjectMapper() {
     }
 
     @Bean(name = "pluginConfigObjectMapper")
-    ObjectMapper pluginConfigObjectMapper(final VariableExpander variableExpander) {
+    ObjectMapper pluginConfigObjectMapper(
+            final VariableExpander variableExpander,
+            final EventKeyFactory eventKeyFactory) {
         final SimpleModule simpleModule = new SimpleModule();
         simpleModule.addDeserializer(Duration.class, new DataPrepperDurationDeserializer());
         simpleModule.addDeserializer(ByteCount.class, new ByteCountDeserializer());
+        simpleModule.addDeserializer(EventKey.class, new EventKeyDeserializer(eventKeyFactory));
         TRANSLATE_VALUE_SUPPORTED_JAVA_TYPES.stream().forEach(clazz -> simpleModule.addDeserializer(
                 clazz, new DataPrepperScalarTypeDeserializer<>(variableExpander, clazz)));
 
diff --git a/data-prepper-plugin-framework/src/test/java/org/opensearch/dataprepper/plugin/ApplicationContextToTypedSuppliersTest.java b/data-prepper-plugin-framework/src/test/java/org/opensearch/dataprepper/plugin/ApplicationContextToTypedSuppliersTest.java
index 0cd008559a..a12540a46a 100644
--- a/data-prepper-plugin-framework/src/test/java/org/opensearch/dataprepper/plugin/ApplicationContextToTypedSuppliersTest.java
+++ b/data-prepper-plugin-framework/src/test/java/org/opensearch/dataprepper/plugin/ApplicationContextToTypedSuppliersTest.java
@@ -12,6 +12,7 @@
 import org.opensearch.dataprepper.model.acknowledgements.AcknowledgementSetManager;
 import org.opensearch.dataprepper.model.breaker.CircuitBreaker;
 import org.opensearch.dataprepper.model.event.EventFactory;
+import org.opensearch.dataprepper.model.event.EventKeyFactory;
 
 import java.util.Map;
 import java.util.function.Supplier;
@@ -28,6 +29,9 @@ class ApplicationContextToTypedSuppliersTest {
     @Mock
     private EventFactory eventFactory;
 
+    @Mock
+    private EventKeyFactory eventKeyFactory;
+
     @Mock
     private AcknowledgementSetManager acknowledgementSetManager;
 
@@ -37,6 +41,7 @@ class ApplicationContextToTypedSuppliersTest {
     private ApplicationContextToTypedSuppliers createObjectUnderTest() {
         return new ApplicationContextToTypedSuppliers(
                 eventFactory,
+                eventKeyFactory,
                 acknowledgementSetManager,
                 circuitBreaker
         );
@@ -58,12 +63,16 @@ void constructor_throws_with_null_AcknowledgementSetManager() {
     void getArgumentsSuppliers_returns_map_with_expected_classes() {
         final Map<Class<?>, Supplier<Object>> argumentsSuppliers = createObjectUnderTest().getArgumentsSuppliers();
 
-        assertThat(argumentsSuppliers.size(), equalTo(3));
+        assertThat(argumentsSuppliers.size(), equalTo(4));
 
         assertThat(argumentsSuppliers, hasKey(EventFactory.class));
         assertThat(argumentsSuppliers.get(EventFactory.class), notNullValue());
         assertThat(argumentsSuppliers.get(EventFactory.class).get(), equalTo(eventFactory));
 
+        assertThat(argumentsSuppliers, hasKey(EventKeyFactory.class));
+        assertThat(argumentsSuppliers.get(EventKeyFactory.class), notNullValue());
+        assertThat(argumentsSuppliers.get(EventKeyFactory.class).get(), equalTo(eventKeyFactory));
+
         assertThat(argumentsSuppliers, hasKey(AcknowledgementSetManager.class));
         assertThat(argumentsSuppliers.get(AcknowledgementSetManager.class), notNullValue());
         assertThat(argumentsSuppliers.get(AcknowledgementSetManager.class).get(), equalTo(acknowledgementSetManager));
@@ -79,12 +88,16 @@ void getArgumentsSuppliers_returns_map_with_null_optional_CircuitBreaker() {
         final Map<Class<?>, Supplier<Object>> argumentsSuppliers = createObjectUnderTest().getArgumentsSuppliers();
 
-        assertThat(argumentsSuppliers.size(), equalTo(3));
+        assertThat(argumentsSuppliers.size(), equalTo(4));
 
         assertThat(argumentsSuppliers, hasKey(EventFactory.class));
         assertThat(argumentsSuppliers.get(EventFactory.class), notNullValue());
         assertThat(argumentsSuppliers.get(EventFactory.class).get(), equalTo(eventFactory));
 
+        assertThat(argumentsSuppliers, hasKey(EventKeyFactory.class));
+        assertThat(argumentsSuppliers.get(EventKeyFactory.class), notNullValue());
+        assertThat(argumentsSuppliers.get(EventKeyFactory.class).get(), equalTo(eventKeyFactory));
+
         assertThat(argumentsSuppliers, hasKey(AcknowledgementSetManager.class));
         assertThat(argumentsSuppliers.get(AcknowledgementSetManager.class), notNullValue());
         assertThat(argumentsSuppliers.get(AcknowledgementSetManager.class).get(), equalTo(acknowledgementSetManager));
diff --git a/data-prepper-plugin-framework/src/test/java/org/opensearch/dataprepper/plugin/ObjectMapperConfigurationTest.java b/data-prepper-plugin-framework/src/test/java/org/opensearch/dataprepper/plugin/ObjectMapperConfigurationTest.java
index d839566680..594d3a47c2 100644
--- a/data-prepper-plugin-framework/src/test/java/org/opensearch/dataprepper/plugin/ObjectMapperConfigurationTest.java
+++ b/data-prepper-plugin-framework/src/test/java/org/opensearch/dataprepper/plugin/ObjectMapperConfigurationTest.java
@@ -11,6 +11,8 @@
 import org.junit.jupiter.api.extension.ExtendWith;
 import org.mockito.Mock;
 import org.mockito.junit.jupiter.MockitoExtension;
+import org.opensearch.dataprepper.model.event.EventKey;
+import org.opensearch.dataprepper.model.event.EventKeyFactory;
 
 import java.time.Duration;
 import java.util.Arrays;
@@ -20,6 +22,8 @@
 
 import static org.hamcrest.MatcherAssert.assertThat;
 import static org.hamcrest.Matchers.equalTo;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.when;
 
 @ExtendWith(MockitoExtension.class)
 class ObjectMapperConfigurationTest {
@@ -28,10 +32,13 @@ class ObjectMapperConfigurationTest {
     @Mock
     private VariableExpander variableExpander;
 
+    @Mock
+    private EventKeyFactory eventKeyFactory;
+
     @Test
     void test_duration_with_pluginConfigObjectMapper() {
         final String durationTestString = "10s";
-        final ObjectMapper objectMapper = objectMapperConfiguration.pluginConfigObjectMapper(variableExpander);
+        final ObjectMapper objectMapper = objectMapperConfiguration.pluginConfigObjectMapper(variableExpander, eventKeyFactory);
         final Duration duration = objectMapper.convertValue(durationTestString, Duration.class);
         assertThat(duration, equalTo(Duration.ofSeconds(10)));
     }
@@ -39,7 +46,7 @@ void test_duration_with_pluginConfigObjectMapper() {
     @Test
     void test_enum_with_pluginConfigObjectMapper() {
         final String testString = "test";
-        final ObjectMapper objectMapper = objectMapperConfiguration.pluginConfigObjectMapper(variableExpander);
+        final ObjectMapper objectMapper = objectMapperConfiguration.pluginConfigObjectMapper(variableExpander, eventKeyFactory);
         final TestType duration = objectMapper.convertValue(testString, TestType.class);
         assertThat(duration, equalTo(TestType.fromOptionValue(testString)));
     }
@@ -60,6 +67,16 @@ void test_enum_with_extensionPluginConfigObjectMapper() {
         assertThat(duration, equalTo(TestType.fromOptionValue(testString)));
     }
 
+    @Test
+    void test_eventKey_with_pluginConfigObjectMapper() {
+        final String testKey = "test";
+        final EventKey eventKey = mock(EventKey.class);
+        when(eventKeyFactory.createEventKey(testKey, EventKeyFactory.EventAction.ALL)).thenReturn(eventKey);
+        final ObjectMapper objectMapper = objectMapperConfiguration.pluginConfigObjectMapper(variableExpander, eventKeyFactory);
+        final EventKey actualEventKey = objectMapper.convertValue(testKey, EventKey.class);
+        assertThat(actualEventKey, equalTo(eventKey));
+    }
+
     private enum TestType {
 
         TEST("test");
diff --git a/data-prepper-plugins/aggregate-processor/build.gradle b/data-prepper-plugins/aggregate-processor/build.gradle
index 744986e924..9a3eb4551a 100644
--- a/data-prepper-plugins/aggregate-processor/build.gradle
+++ b/data-prepper-plugins/aggregate-processor/build.gradle
@@ -19,7 +19,6 @@ dependencies {
     implementation libs.opentelemetry.proto
     implementation 'com.fasterxml.jackson.core:jackson-databind'
     implementation 'io.micrometer:micrometer-core'
-    testImplementation testLibs.mockito.inline
 }
 
 jacocoTestCoverageVerification {
diff --git a/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/AggregateProcessor.java b/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/AggregateProcessor.java
index 8a6f9d5a6a..68cb6f6e65 100644
--- a/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/AggregateProcessor.java
+++ b/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/AggregateProcessor.java
@@ -20,6 +20,7 @@
 import io.micrometer.core.instrument.Counter;
 import org.opensearch.dataprepper.plugins.hasher.IdentificationKeysHasher;
 
+import java.math.BigDecimal;
 import java.util.Collection;
 import java.util.LinkedList;
 import java.util.List;
@@ -147,6 +148,23 @@ public static long getTimeNanos(final Instant time) {
         return currentTimeNanos;
     }
 
+    public static Instant convertObjectToInstant(Object timeObject) {
+        if (timeObject instanceof Instant) {
+            return (Instant)timeObject;
+        } else if (timeObject instanceof String) {
+            return Instant.parse((String)timeObject);
+        } else if (timeObject instanceof Integer || timeObject instanceof Long) {
+            long value = ((Number)timeObject).longValue();
+            return (value > 1E10) ? Instant.ofEpochMilli(value) : Instant.ofEpochSecond(value);
+        } else if (timeObject instanceof Double || timeObject instanceof Float || timeObject instanceof BigDecimal) {
+            double value = ((Number)timeObject).doubleValue();
+            long seconds = (long) value;
+            long nanos = (long) ((value - seconds) * 1_000_000_000);
+            return Instant.ofEpochSecond(seconds, nanos);
+        } else {
+            throw new RuntimeException("Invalid format for time "+timeObject);
+        }
+    }
 
     @Override
     public void prepareForShutdown() {
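convertObjectToInstant, added above, normalizes the several shapes a timestamp can take in an event. For integral values it infers the unit from magnitude: values above 1E10 are treated as epoch milliseconds and smaller values as epoch seconds (1E10 seconds falls around the year 2286, so the two ranges cannot collide for contemporary timestamps); floating-point values are split into whole seconds plus nanoseconds. A few worked examples of that heuristic:

    // Illustrative inputs -> results for convertObjectToInstant:
    convertObjectToInstant(1_700_000_000L);         // <= 1E10, epoch seconds -> 2023-11-14T22:13:20Z
    convertObjectToInstant(1_700_000_000_000L);     // >  1E10, epoch millis  -> 2023-11-14T22:13:20Z
    convertObjectToInstant(1.7000000005E9);         // double, seconds.nanos  -> 2023-11-14T22:13:20.500Z
    convertObjectToInstant("2023-11-14T22:13:20Z"); // String, via Instant.parse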
diff --git a/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateAction.java b/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateAction.java index fae6a19289..c8fd772336 100644 --- a/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateAction.java +++ b/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateAction.java @@ -18,15 +18,21 @@ import org.opensearch.dataprepper.plugins.processor.aggregate.AggregateActionInput; import org.opensearch.dataprepper.plugins.processor.aggregate.AggregateActionOutput; import org.opensearch.dataprepper.plugins.processor.aggregate.AggregateActionResponse; +import org.opensearch.dataprepper.plugins.processor.aggregate.AggregateProcessor; +import static org.opensearch.dataprepper.plugins.processor.aggregate.AggregateProcessor.getTimeNanos; import org.opensearch.dataprepper.plugins.processor.aggregate.GroupState; import io.opentelemetry.proto.metrics.v1.AggregationTemporality; +import org.opensearch.dataprepper.plugins.hasher.IdentificationKeysHasher; import java.time.Instant; -import java.util.List; import java.time.ZoneId; import java.time.format.DateTimeFormatter; -import java.util.Map; + import java.util.HashMap; +import java.util.HashSet; +import java.util.List; +import java.util.Map; +import java.util.Set; /** * An AggregateAction that combines multiple Events into a single Event. This action will count the number of events with same keys and will create a combined event @@ -36,28 +42,32 @@ @DataPrepperPlugin(name = "count", pluginType = AggregateAction.class, pluginConfigurationType = CountAggregateActionConfig.class) public class CountAggregateAction implements AggregateAction { private static final String DATE_FORMAT = "yyyy-MM-dd'T'HH:mm:ss.SSSXXX"; + private static final String UNIQUE_KEYS_SETKEY = "__unique_keys"; private static final String exemplarKey = "__exemplar"; static final String EVENT_TYPE = "event"; - static final String SUM_METRIC_NAME = "count"; static final String SUM_METRIC_DESCRIPTION = "Number of events"; static final String SUM_METRIC_UNIT = "1"; static final boolean SUM_METRIC_IS_MONOTONIC = true; public final String countKey; public final String startTimeKey; + public final String endTimeKey; public final String outputFormat; private long startTimeNanos; + private final String metricName; + private final IdentificationKeysHasher uniqueKeysHasher; @DataPrepperPluginConstructor public CountAggregateAction(final CountAggregateActionConfig countAggregateActionConfig) { this.countKey = countAggregateActionConfig.getCountKey(); this.startTimeKey = countAggregateActionConfig.getStartTimeKey(); + this.endTimeKey = countAggregateActionConfig.getEndTimeKey(); this.outputFormat = countAggregateActionConfig.getOutputFormat(); - } - - private long getTimeNanos(Instant time) { - final long NANO_MULTIPLIER = 1_000 * 1_000 * 1_000; - long currentTimeNanos = time.getEpochSecond() * NANO_MULTIPLIER + time.getNano(); - return currentTimeNanos; + this.metricName = countAggregateActionConfig.getMetricName(); + if (countAggregateActionConfig.getUniqueKeys() != null) { + this.uniqueKeysHasher = new IdentificationKeysHasher(countAggregateActionConfig.getUniqueKeys()); + } else { + this.uniqueKeysHasher = null; + } } public Exemplar createExemplar(final Event event) { @@ -81,15 +91,45 @@ public Exemplar createExemplar(final Event event) { @Override public AggregateActionResponse handleEvent(final Event event, final AggregateActionInput aggregateActionInput) { final GroupState groupState = aggregateActionInput.getGroupState(); + Instant eventStartTime = Instant.now(); + Instant eventEndTime = eventStartTime; + Object startTime = event.get(startTimeKey, Object.class); + Object endTime = event.get(endTimeKey, Object.class); + + if (startTime != null) { + eventStartTime = AggregateProcessor.convertObjectToInstant(startTime); + } + if (endTime != null) { + eventEndTime = AggregateProcessor.convertObjectToInstant(endTime); + } if (groupState.get(countKey) == null) { - groupState.put(startTimeKey, Instant.now()); groupState.putAll(aggregateActionInput.getIdentificationKeys()); + if (uniqueKeysHasher != null) { + Set uniqueKeysMapSet = new HashSet<>(); + + uniqueKeysMapSet.add(uniqueKeysHasher.createIdentificationKeysMapFromEvent(event)); + groupState.put(UNIQUE_KEYS_SETKEY, uniqueKeysMapSet); + } groupState.put(countKey, 1); groupState.put(exemplarKey, createExemplar(event)); + groupState.put(startTimeKey, eventStartTime); + groupState.put(endTimeKey, eventEndTime); } else { Integer v = (Integer)groupState.get(countKey) + 1; + + if (uniqueKeysHasher != null) { + Set uniqueKeysMapSet = (Set)
groupState.get(UNIQUE_KEYS_SETKEY); + uniqueKeysMapSet.add(uniqueKeysHasher.createIdentificationKeysMapFromEvent(event)); + v = uniqueKeysMapSet.size(); + } groupState.put(countKey, v); - } + Instant groupStartTime = (Instant)groupState.get(startTimeKey); + Instant groupEndTime = (Instant)groupState.get(endTimeKey); + if (eventStartTime.isBefore(groupStartTime)) + groupState.put(startTimeKey, eventStartTime); + if (eventEndTime.isAfter(groupEndTime)) + groupState.put(endTimeKey, eventEndTime); + } return AggregateActionResponse.nullEventResponse(); } @@ -98,6 +138,9 @@ public AggregateActionOutput concludeGroup(final AggregateActionInput aggregateA GroupState groupState = aggregateActionInput.getGroupState(); Event event; Instant startTime = (Instant)groupState.get(startTimeKey); + Instant endTime = (Instant)groupState.get(endTimeKey); + groupState.remove(endTimeKey); + groupState.remove(UNIQUE_KEYS_SETKEY); if (outputFormat.equals(OutputFormat.RAW.toString())) { groupState.put(startTimeKey, startTime.atZone(ZoneId.of(ZoneId.systemDefault().toString())).format(DateTimeFormatter.ofPattern(DATE_FORMAT))); event = JacksonEvent.builder() @@ -110,14 +153,14 @@ public AggregateActionOutput concludeGroup(final AggregateActionInput aggregateA groupState.remove(exemplarKey); groupState.remove(countKey); groupState.remove(startTimeKey); - long currentTimeNanos = getTimeNanos(Instant.now()); + long endTimeNanos = getTimeNanos(endTime); long startTimeNanos = getTimeNanos(startTime); Map attr = new HashMap(); groupState.forEach((k, v) -> attr.put((String)k, v)); JacksonSum sum = JacksonSum.builder() - .withName(SUM_METRIC_NAME) + .withName(this.metricName) .withDescription(SUM_METRIC_DESCRIPTION) - .withTime(OTelProtoCodec.convertUnixNanosToISO8601(currentTimeNanos)) + .withTime(OTelProtoCodec.convertUnixNanosToISO8601(endTimeNanos)) .withStartTime(OTelProtoCodec.convertUnixNanosToISO8601(startTimeNanos)) .withIsMonotonic(SUM_METRIC_IS_MONOTONIC) .withUnit(SUM_METRIC_UNIT) @@ -128,7 +171,7 @@ public AggregateActionOutput concludeGroup(final AggregateActionInput aggregateA .build(false); event = (Event)sum; } - + return new AggregateActionOutput(List.of(event)); } } diff --git a/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateActionConfig.java b/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateActionConfig.java index cbe5ebb20b..1144aee261 100644 --- a/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateActionConfig.java +++ b/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateActionConfig.java @@ -5,28 +5,52 @@ package org.opensearch.dataprepper.plugins.processor.aggregate.actions; -import java.util.Set; import java.util.HashSet; +import java.util.List; +import java.util.Set; import com.fasterxml.jackson.annotation.JsonProperty; public class CountAggregateActionConfig { + static final String SUM_METRIC_NAME = "count"; public static final String DEFAULT_COUNT_KEY = "aggr._count"; public static final String DEFAULT_START_TIME_KEY = "aggr._start_time"; + public static final String DEFAULT_END_TIME_KEY = "aggr._end_time"; public static final Set validOutputFormats = new HashSet<>(Set.of(OutputFormat.OTEL_METRICS.toString(), OutputFormat.RAW.toString())); @JsonProperty("count_key") 
String countKey = DEFAULT_COUNT_KEY; + @JsonProperty("metric_name") + String metricName = SUM_METRIC_NAME; + + @JsonProperty("unique_keys") + List<String> uniqueKeys = null; + @JsonProperty("start_time_key") String startTimeKey = DEFAULT_START_TIME_KEY; + @JsonProperty("end_time_key") + String endTimeKey = DEFAULT_END_TIME_KEY; + @JsonProperty("output_format") String outputFormat = OutputFormat.OTEL_METRICS.toString(); + public String getMetricName() { + return metricName; + } + + public List<String> getUniqueKeys() { + return uniqueKeys; + } + public String getCountKey() { return countKey; } + public String getEndTimeKey() { + return endTimeKey; + } + public String getStartTimeKey() { return startTimeKey; } @@ -37,4 +61,4 @@ public String getOutputFormat() { } return outputFormat; } -} +}
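The config above adds three user-facing knobs: `metric_name` (defaults to the `count` metric name), `unique_keys` (switches the action from plain counting to distinct counting), and `end_time_key`. A sketch of how those JSON/YAML keys bind, using a hypothetical mirror class rather than the real `CountAggregateActionConfig` (field names and defaults copied from the diff; the mirror itself is illustrative only), before moving on to the histogram action below:

```java
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.List;
import java.util.Map;

public class CountConfigBindingDemo {
    // Hypothetical mirror of the new CountAggregateActionConfig fields, just to show
    // how the snake_case keys map onto the defaults added in the diff above.
    static class CountConfig {
        @JsonProperty("metric_name")
        String metricName = "count";          // SUM_METRIC_NAME default
        @JsonProperty("unique_keys")
        List<String> uniqueKeys = null;       // null -> plain event counting
        @JsonProperty("end_time_key")
        String endTimeKey = "aggr._end_time"; // DEFAULT_END_TIME_KEY
    }

    public static void main(String[] args) {
        final ObjectMapper mapper = new ObjectMapper();
        final CountConfig config = mapper.convertValue(
                Map.of("metric_name", "distinct_users", "unique_keys", List.of("user_id")),
                CountConfig.class);
        // Unset keys keep their defaults:
        System.out.println(config.metricName + " " + config.uniqueKeys + " " + config.endTimeKey);
        // -> distinct_users [user_id] aggr._end_time
    }
}
```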
diff --git a/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateAction.java b/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateAction.java index 5e93f305bc..bdb9a3fad6 100644 --- a/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateAction.java +++ b/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateAction.java @@ -17,6 +17,7 @@ import org.opensearch.dataprepper.model.event.JacksonEvent; import org.opensearch.dataprepper.plugins.processor.aggregate.AggregateAction; import org.opensearch.dataprepper.plugins.processor.aggregate.AggregateActionInput; +import org.opensearch.dataprepper.plugins.processor.aggregate.AggregateProcessor; import org.opensearch.dataprepper.plugins.processor.aggregate.AggregateActionOutput; import org.opensearch.dataprepper.plugins.processor.aggregate.AggregateActionResponse; import org.opensearch.dataprepper.plugins.processor.aggregate.GroupState; @@ -35,15 +36,14 @@ import java.util.ArrayList; /** - * An AggregateAction that combines multiple Events into a single Event. This action will create a combined event with histogram buckets of the values - * of specified list of keys from the groupState on concludeGroup. + * An AggregateAction that combines multiple Events into a single Event. This action will create a combined event with histogram buckets of the values + * of specified list of keys from the groupState on concludeGroup. * @since 2.1 */ @DataPrepperPlugin(name = "histogram", pluginType = AggregateAction.class, pluginConfigurationType = HistogramAggregateActionConfig.class) public class HistogramAggregateAction implements AggregateAction { private static final String DATE_FORMAT = "yyyy-MM-dd'T'HH:mm:ss.SSSXXX"; private static final String EVENT_TYPE = "event"; - public static final String HISTOGRAM_METRIC_NAME = "histogram"; private final String countKey; private final String bucketCountsKey; private final String bucketsKey; @@ -61,6 +61,7 @@ public class HistogramAggregateAction implements AggregateAction { private Event maxEvent; private double minValue; private double maxValue; + private final String metricName; private long startTimeNanos; private double[] buckets; @@ -71,6 +72,7 @@ public HistogramAggregateAction(final HistogramAggregateActionConfig histogramAg List<Number> bucketList = histogramAggregateActionConfig.getBuckets(); this.buckets = new double[bucketList.size()+2]; int bucketIdx = 0; + this.metricName = histogramAggregateActionConfig.getMetricName(); this.buckets[bucketIdx++] = -Float.MAX_VALUE; for (int i = 0; i < bucketList.size(); i++) { this.buckets[bucketIdx++] = convertToDouble(bucketList.get(i)); @@ -137,16 +139,29 @@ public AggregateActionResponse handleEvent(final Event event, final AggregateAct return AggregateActionResponse.nullEventResponse(); } double doubleValue = convertToDouble(value); - + int idx = Arrays.binarySearch(this.buckets, doubleValue); if (idx < 0) { idx = -idx-2; } + Instant eventTime = Instant.now(); + Instant eventStartTime = eventTime; + Instant eventEndTime = eventTime; + Object startTime = event.get(startTimeKey, Object.class); + Object endTime = event.get(endTimeKey, Object.class); + if (startTime != null) { + eventStartTime = AggregateProcessor.convertObjectToInstant(startTime); + } + if (endTime != null) { + eventEndTime = AggregateProcessor.convertObjectToInstant(endTime); + } if (groupState.get(bucketCountsKey) == null) { + groupState.put(startTimeKey, eventStartTime); + groupState.put(endTimeKey, eventEndTime); Long[] bucketCountsList = new Long[buckets.length-1]; Arrays.fill(bucketCountsList, (long)0); bucketCountsList[idx]++; - groupState.put(startTimeKey, Instant.now()); groupState.putAll(aggregateActionInput.getIdentificationKeys()); groupState.put(sumKey, doubleValue); groupState.put(countKey, 1); @@ -180,9 +195,13 @@ public AggregateActionResponse handleEvent(final Event event, final AggregateAct maxValue = doubleValue; } } - } - // Keep over-writing endTime to get the last time a record of this group received - groupState.put(endTimeKey, Instant.now()); + Instant groupStartTime = (Instant)groupState.get(startTimeKey); + Instant groupEndTime = (Instant)groupState.get(endTimeKey); + if (eventStartTime.isBefore(groupStartTime)) + groupState.put(startTimeKey, eventStartTime); + if (eventEndTime.isAfter(groupEndTime)) + groupState.put(endTimeKey, eventEndTime); + } return AggregateActionResponse.nullEventResponse(); } @@ -194,7 +213,7 @@ public AggregateActionOutput concludeGroup(final AggregateActionInput aggregateA Instant endTime = (Instant)groupState.get(endTimeKey); long startTimeNanos = getTimeNanos(startTime); long endTimeNanos = getTimeNanos(endTime); - String histogramKey = HISTOGRAM_METRIC_NAME + "_key"; + String histogramKey = this.metricName + "_key"; List<Exemplar> exemplarList = new ArrayList<>(); exemplarList.add(createExemplar("min", minEvent, minValue)); exemplarList.add(createExemplar("max", maxEvent,
maxValue)); @@ -227,7 +246,7 @@ public AggregateActionOutput concludeGroup(final AggregateActionInput aggregateA Integer count = (Integer)groupState.get(countKey); String description = String.format("Histogram of %s in the events", key); JacksonHistogram histogram = JacksonHistogram.builder() - .withName(HISTOGRAM_METRIC_NAME) + .withName(this.metricName) .withDescription(description) .withTime(OTelProtoCodec.convertUnixNanosToISO8601(endTimeNanos)) .withStartTime(OTelProtoCodec.convertUnixNanosToISO8601(startTimeNanos)) @@ -247,7 +266,7 @@ public AggregateActionOutput concludeGroup(final AggregateActionInput aggregateA .build(false); event = (Event)histogram; } - + return new AggregateActionOutput(List.of(event)); } } diff --git a/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateActionConfig.java b/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateActionConfig.java index a173671836..7c998c123d 100644 --- a/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateActionConfig.java +++ b/data-prepper-plugins/aggregate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateActionConfig.java @@ -12,6 +12,7 @@ import jakarta.validation.constraints.NotNull; public class HistogramAggregateActionConfig { + public static final String HISTOGRAM_METRIC_NAME = "histogram"; public static final String DEFAULT_GENERATED_KEY_PREFIX = "aggr._"; public static final String SUM_KEY = "sum"; public static final String COUNT_KEY = "count"; @@ -32,6 +33,9 @@ public class HistogramAggregateActionConfig { @NotNull String units; + @JsonProperty("metric_name") + String metricName = HISTOGRAM_METRIC_NAME; + @JsonProperty("generated_key_prefix") String generatedKeyPrefix = DEFAULT_GENERATED_KEY_PREFIX; @@ -45,6 +49,10 @@ public class HistogramAggregateActionConfig { @JsonProperty("record_minmax") boolean recordMinMax = false; + public String getMetricName() { + return metricName; + } + public boolean getRecordMinMax() { return recordMinMax; } diff --git a/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/AggregateProcessorIT.java b/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/AggregateProcessorIT.java index ea7b6eb416..fc416b0e45 100644 --- a/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/AggregateProcessorIT.java +++ b/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/AggregateProcessorIT.java @@ -533,7 +533,7 @@ void aggregateWithHistogramAggregateAction() throws InterruptedException, NoSuch countDownLatch.countDown(); }); } - Thread.sleep(GROUP_DURATION_FOR_ONLY_SINGLE_CONCLUDE * 1000); + Thread.sleep(GROUP_DURATION_FOR_ONLY_SINGLE_CONCLUDE * 1500); boolean allThreadsFinished = countDownLatch.await(5L, TimeUnit.SECONDS); assertThat(allThreadsFinished, equalTo(true)); diff --git a/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/AggregateProcessorStaticFunctionsTest.java 
b/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/AggregateProcessorStaticFunctionsTest.java new file mode 100644 index 0000000000..8c9892ab29 --- /dev/null +++ b/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/AggregateProcessorStaticFunctionsTest.java @@ -0,0 +1,40 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.processor.aggregate; + +import org.junit.jupiter.api.Test; +import static org.hamcrest.CoreMatchers.equalTo; +import static org.hamcrest.MatcherAssert.assertThat; +import static org.junit.jupiter.api.Assertions.assertTrue; + +import java.time.Instant; +import java.time.Duration; + +public class AggregateProcessorStaticFunctionsTest { + @Test + public void testConvertObjectToInstant() { + Instant now = Instant.now(); + assertThat(AggregateProcessor.convertObjectToInstant(now), equalTo(now)); + String nowStr = now.toString(); + long nowSeconds = now.getEpochSecond(); + long nowMillis = now.toEpochMilli(); + int nowNanos = now.getNano(); + double nowDouble = nowSeconds+(double)nowNanos/1000_000_000; + assertThat(AggregateProcessor.convertObjectToInstant(nowStr), equalTo(now)); + assertThat(AggregateProcessor.convertObjectToInstant(nowSeconds), equalTo(Instant.ofEpochSecond(nowSeconds))); + assertThat(AggregateProcessor.convertObjectToInstant(nowMillis), equalTo(Instant.ofEpochMilli(nowMillis))); + Duration tolerance = Duration.ofNanos(1000); + assertTrue((Duration.between(AggregateProcessor.convertObjectToInstant(nowDouble), Instant.ofEpochSecond(nowSeconds, nowNanos))).abs().compareTo(tolerance) <= 0); + } + + @Test + public void testGetTimeNanos() { + Instant now = Instant.now(); + assertThat(AggregateProcessor.getTimeNanos(now) / 1000_000_000, equalTo(now.getEpochSecond())); + assertThat(AggregateProcessor.getTimeNanos(now) % 1000_000_000, equalTo((long)now.getNano())); + } +} + diff --git a/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateActionConfigTests.java b/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateActionConfigTests.java index f022ac9148..1975918e37 100644 --- a/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateActionConfigTests.java +++ b/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateActionConfigTests.java @@ -13,6 +13,8 @@ import static org.opensearch.dataprepper.plugins.processor.aggregate.actions.CountAggregateActionConfig.DEFAULT_COUNT_KEY; import static org.opensearch.dataprepper.plugins.processor.aggregate.actions.CountAggregateActionConfig.DEFAULT_START_TIME_KEY; +import java.util.ArrayList; +import java.util.List; import java.util.UUID; import static org.hamcrest.CoreMatchers.equalTo; @@ -38,6 +40,8 @@ void testDefault() { assertThat(countAggregateActionConfig.getCountKey(), equalTo(DEFAULT_COUNT_KEY)); assertThat(countAggregateActionConfig.getStartTimeKey(), equalTo(DEFAULT_START_TIME_KEY)); assertThat(countAggregateActionConfig.getOutputFormat(), equalTo(OutputFormat.OTEL_METRICS.toString())); + assertThat(countAggregateActionConfig.getMetricName(), equalTo(CountAggregateActionConfig.SUM_METRIC_NAME)); + 
assertThat(countAggregateActionConfig.getUniqueKeys(), equalTo(null)); } @Test @@ -51,6 +55,14 @@ void testValidConfig() throws NoSuchFieldException, IllegalAccessException { final String testOutputFormat = OutputFormat.OTEL_METRICS.toString(); setField(CountAggregateActionConfig.class, countAggregateActionConfig, "outputFormat", testOutputFormat); assertThat(countAggregateActionConfig.getOutputFormat(), equalTo(OutputFormat.OTEL_METRICS.toString())); + final String testName = UUID.randomUUID().toString(); + setField(CountAggregateActionConfig.class, countAggregateActionConfig, "metricName", testName); + assertThat(countAggregateActionConfig.getMetricName(), equalTo(testName)); + final List uniqueKeys = new ArrayList<>(); + uniqueKeys.add(UUID.randomUUID().toString()); + uniqueKeys.add(UUID.randomUUID().toString()); + setField(CountAggregateActionConfig.class, countAggregateActionConfig, "uniqueKeys", uniqueKeys); + assertThat(countAggregateActionConfig.getUniqueKeys(), equalTo(uniqueKeys)); } @Test diff --git a/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateActionTest.java b/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateActionTest.java index 66936fa7f8..af81ca001f 100644 --- a/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateActionTest.java +++ b/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/CountAggregateActionTest.java @@ -25,8 +25,12 @@ import java.util.HashMap; import java.util.Map; import java.util.List; +import java.util.Random; import java.util.UUID; +import java.time.Instant; +import static org.mockito.Mockito.when; +import static org.mockito.Mockito.mock; import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.not; import static org.hamcrest.MatcherAssert.assertThat; @@ -46,6 +50,7 @@ private AggregateAction createObjectUnderTest(CountAggregateActionConfig config) @ParameterizedTest @ValueSource(ints = {1, 2, 10, 100}) void testCountAggregate(int testCount) throws NoSuchFieldException, IllegalAccessException { + final String testName = UUID.randomUUID().toString(); CountAggregateActionConfig countAggregateActionConfig = new CountAggregateActionConfig(); setField(CountAggregateActionConfig.class, countAggregateActionConfig, "outputFormat", OutputFormat.RAW.toString()); countAggregateAction = createObjectUnderTest(countAggregateActionConfig); @@ -75,8 +80,10 @@ void testCountAggregate(int testCount) throws NoSuchFieldException, IllegalAcces @ParameterizedTest @ValueSource(ints = {1, 2, 10, 100}) - void testCountAggregateOTelFormat(int testCount) { + void testCountAggregateOTelFormat(int testCount) throws NoSuchFieldException, IllegalAccessException { CountAggregateActionConfig countAggregateActionConfig = new CountAggregateActionConfig(); + final String testName = UUID.randomUUID().toString(); + setField(CountAggregateActionConfig.class, countAggregateActionConfig, "metricName", testName); countAggregateAction = createObjectUnderTest(countAggregateActionConfig); final String key1 = "key-"+UUID.randomUUID().toString(); final String value1 = UUID.randomUUID().toString(); @@ -115,6 +122,7 @@ void testCountAggregateOTelFormat(int testCount) { expectedEventMap.put("isMonotonic", true); expectedEventMap.put("aggregationTemporality", 
"AGGREGATION_TEMPORALITY_DELTA"); expectedEventMap.put("unit", "1"); + expectedEventMap.put("name", testName); expectedEventMap.forEach((k, v) -> assertThat(result.get(0).toMap(), hasEntry(k,v))); assertThat(result.get(0).toMap().get("attributes"), equalTo(eventMap)); JacksonMetric metric = (JacksonMetric) result.get(0); @@ -139,4 +147,144 @@ void testCountAggregateOTelFormat(int testCount) { assertThat(attributes.get(key2), equalTo(value2)); assertTrue(attributes.containsKey(dataKey2)); } + + @ParameterizedTest + @ValueSource(ints = {1, 2, 10, 100}) + void testCountAggregateOTelFormatWithStartAndEndTimesInTheEvent(int testCount) { + CountAggregateActionConfig mockConfig = mock(CountAggregateActionConfig.class); + when(mockConfig.getCountKey()).thenReturn(CountAggregateActionConfig.DEFAULT_COUNT_KEY); + when(mockConfig.getUniqueKeys()).thenReturn(null); + final String testName = UUID.randomUUID().toString(); + when(mockConfig.getMetricName()).thenReturn(testName); + String startTimeKey = UUID.randomUUID().toString(); + String endTimeKey = UUID.randomUUID().toString(); + when(mockConfig.getStartTimeKey()).thenReturn(startTimeKey); + when(mockConfig.getEndTimeKey()).thenReturn(endTimeKey); + when(mockConfig.getOutputFormat()).thenReturn(OutputFormat.OTEL_METRICS.toString()); + countAggregateAction = createObjectUnderTest(mockConfig); + final String key1 = "key-"+UUID.randomUUID().toString(); + final String value1 = UUID.randomUUID().toString(); + final String dataKey1 = "datakey-"+UUID.randomUUID().toString(); + final String key2 = "key-"+UUID.randomUUID().toString(); + final String value2 = UUID.randomUUID().toString(); + final String dataKey2 = "datakey-"+UUID.randomUUID().toString(); + final Instant testTime = Instant.ofEpochSecond(Instant.now().getEpochSecond()); + Map eventMap = Collections.singletonMap(key1, value1); + Event testEvent = JacksonEvent.builder() + .withEventType("event") + .withData(eventMap) + .build(); + Map eventMap2 = Collections.singletonMap(key2, value2); + JacksonEvent testEvent2 = JacksonEvent.builder() + .withEventType("event") + .withData(eventMap2) + .build(); + AggregateActionInput aggregateActionInput = new AggregateActionTestUtils.TestAggregateActionInput(eventMap); + AggregateActionInput aggregateActionInput2 = new AggregateActionTestUtils.TestAggregateActionInput(eventMap2); + Random random = new Random(); + for (int i = 0; i < testCount; i++) { + testEvent.put(dataKey1, UUID.randomUUID().toString()); + Instant sTime = (i == 0) ? testTime : testTime.plusSeconds(random.nextInt(5)); + Instant eTime = (i == testCount-1) ? 
testTime.plusSeconds(100) : testTime.plusSeconds(50+random.nextInt(45)); + testEvent.put(startTimeKey, sTime); + testEvent.put(endTimeKey, eTime); + testEvent2.put(dataKey2, UUID.randomUUID().toString()); + testEvent2.put(startTimeKey, sTime.toString()); + testEvent2.put(endTimeKey, eTime.toString()); + AggregateActionResponse aggregateActionResponse = countAggregateAction.handleEvent(testEvent, aggregateActionInput); + assertThat(aggregateActionResponse.getEvent(), equalTo(null)); + aggregateActionResponse = countAggregateAction.handleEvent(testEvent2, aggregateActionInput2); + assertThat(aggregateActionResponse.getEvent(), equalTo(null)); + } + + AggregateActionOutput actionOutput = countAggregateAction.concludeGroup(aggregateActionInput); + final List result = actionOutput.getEvents(); + assertThat(result.size(), equalTo(1)); + Map expectedEventMap = new HashMap<>(); + expectedEventMap.put("value", (double)testCount); + expectedEventMap.put("name", testName); + expectedEventMap.put("description", "Number of events"); + expectedEventMap.put("isMonotonic", true); + expectedEventMap.put("aggregationTemporality", "AGGREGATION_TEMPORALITY_DELTA"); + expectedEventMap.put("unit", "1"); + expectedEventMap.forEach((k, v) -> assertThat(result.get(0).toMap(), hasEntry(k,v))); + assertThat(result.get(0).toMap().get("attributes"), equalTo(eventMap)); + JacksonMetric metric = (JacksonMetric) result.get(0); + assertThat(metric.toJsonString().indexOf("attributes"), not(-1)); + assertThat(result.get(0).get("startTime", String.class), equalTo(testTime.toString())); + assertThat(result.get(0).get("time", String.class), equalTo(testTime.plusSeconds(100).toString())); + + assertThat(result.get(0).toMap(), hasKey("startTime")); + assertThat(result.get(0).toMap(), hasKey("time")); + List<Map<String, Object>> exemplars = (List<Map<String, Object>>)result.get(0).toMap().get("exemplars"); + assertThat(exemplars.size(), equalTo(1)); + Map<String, Object> exemplar = exemplars.get(0); + Map<String, Object> attributes = (Map<String, Object>)exemplar.get("attributes"); + assertThat(attributes.get(key1), equalTo(value1)); + assertTrue(attributes.containsKey(dataKey1)); + + actionOutput = countAggregateAction.concludeGroup(aggregateActionInput2); + final List result2 = actionOutput.getEvents(); + assertThat(result2.size(), equalTo(1)); + + exemplars = (List<Map<String, Object>>)result2.get(0).toMap().get("exemplars"); + assertThat(exemplars.size(), equalTo(1)); + exemplar = exemplars.get(0); + attributes = (Map<String, Object>)exemplar.get("attributes"); + assertThat(attributes.get(key2), equalTo(value2)); + assertTrue(attributes.containsKey(dataKey2)); + } + + @ParameterizedTest + @ValueSource(ints = {1, 2, 3, 10, 20}) + void testCountAggregateOTelFormatUniqueKeys(int testCount) throws NoSuchFieldException, IllegalAccessException { + CountAggregateActionConfig countAggregateActionConfig = new CountAggregateActionConfig(); + final String testName = UUID.randomUUID().toString(); + setField(CountAggregateActionConfig.class, countAggregateActionConfig, "metricName", testName); + final String key1 = "key-"+UUID.randomUUID().toString(); + final String value1 = UUID.randomUUID().toString(); + final String dataKey1 = "datakey-"+UUID.randomUUID().toString(); + setField(CountAggregateActionConfig.class, countAggregateActionConfig, "uniqueKeys", List.of(dataKey1)); + countAggregateAction = createObjectUnderTest(countAggregateActionConfig); + Map eventMap = Collections.singletonMap(key1, value1); + Event testEvent = JacksonEvent.builder() + .withEventType("event") + .withData(eventMap) + .build(); + AggregateActionInput aggregateActionInput = new
AggregateActionTestUtils.TestAggregateActionInput(eventMap); + final String dataKey1_1 = UUID.randomUUID().toString(); + final String dataKey1_2 = UUID.randomUUID().toString(); + final String dataKey1_3 = UUID.randomUUID().toString(); + final String[] dataKeysList = {dataKey1_1, dataKey1_2, dataKey1_3}; + for (int i = 0; i < testCount; i++) { + testEvent.put(dataKey1, dataKeysList[i % 3]); + AggregateActionResponse aggregateActionResponse = countAggregateAction.handleEvent(testEvent, aggregateActionInput); + assertThat(aggregateActionResponse.getEvent(), equalTo(null)); + } + + AggregateActionOutput actionOutput = countAggregateAction.concludeGroup(aggregateActionInput); + final List result = actionOutput.getEvents(); + assertThat(result.size(), equalTo(1)); + Map expectedEventMap = new HashMap<>(); + double expectedCount = (testCount >= 3) ? 3 : testCount; + expectedEventMap.put("value", expectedCount); + expectedEventMap.put("description", "Number of events"); + expectedEventMap.put("isMonotonic", true); + expectedEventMap.put("aggregationTemporality", "AGGREGATION_TEMPORALITY_DELTA"); + expectedEventMap.put("unit", "1"); + expectedEventMap.put("name", testName); + expectedEventMap.forEach((k, v) -> assertThat(result.get(0).toMap(), hasEntry(k,v))); + assertThat(result.get(0).toMap().get("attributes"), equalTo(eventMap)); + JacksonMetric metric = (JacksonMetric) result.get(0); + assertThat(metric.toJsonString().indexOf("attributes"), not(-1)); + assertThat(result.get(0).toMap(), hasKey("startTime")); + assertThat(result.get(0).toMap(), hasKey("time")); + List<Map<String, Object>> exemplars = (List<Map<String, Object>>)result.get(0).toMap().get("exemplars"); + assertThat(exemplars.size(), equalTo(1)); + Map<String, Object> exemplar = exemplars.get(0); + Map<String, Object> attributes = (Map<String, Object>)exemplar.get("attributes"); + assertThat(attributes.get(key1), equalTo(value1)); + assertTrue(attributes.containsKey(dataKey1)); + + } }
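The unique-keys test just above relies on the set-based de-duplication added to `CountAggregateAction`: each event is projected onto the configured `unique_keys` fields and the reported count becomes the size of the resulting set, which is why the expected value caps at the number of distinct combinations. A self-contained sketch of that idea (simplified; the real implementation builds the projection via `IdentificationKeysHasher`):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class UniqueKeysCountDemo {
    public static void main(String[] args) {
        final List<String> uniqueKeys = List.of("user_id");
        final Set<Map<String, Object>> seen = new HashSet<>();

        final List<Map<String, Object>> events = List.of(
                Map.of("user_id", "u1", "path", "/a"),
                Map.of("user_id", "u1", "path", "/b"),   // same user, different path
                Map.of("user_id", "u2", "path", "/a"));

        for (final Map<String, Object> event : events) {
            // Project the event onto the unique_keys fields; the set de-duplicates
            // identical projections, so the count is "distinct key combinations".
            final Map<String, Object> projection = new HashMap<>();
            for (final String key : uniqueKeys) {
                projection.put(key, event.get(key));
            }
            seen.add(projection);
        }
        System.out.println(seen.size()); // 2, not 3
    }
}
```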
diff --git a/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateActionConfigTests.java b/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateActionConfigTests.java index f3e1e19d25..60ba8dc202 100644 --- a/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateActionConfigTests.java +++ b/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateActionConfigTests.java @@ -11,6 +11,8 @@ import org.mockito.junit.jupiter.MockitoExtension; import org.apache.commons.lang3.RandomStringUtils; +import java.util.UUID; + import static org.opensearch.dataprepper.plugins.processor.aggregate.actions.HistogramAggregateActionConfig.DEFAULT_GENERATED_KEY_PREFIX; import java.util.concurrent.ThreadLocalRandom; @@ -41,6 +43,7 @@ void testDefault() { assertThat(histogramAggregateActionConfig.getGeneratedKeyPrefix(), equalTo(DEFAULT_GENERATED_KEY_PREFIX)); assertThat(histogramAggregateActionConfig.getRecordMinMax(), equalTo(false)); assertThat(histogramAggregateActionConfig.getOutputFormat(), equalTo(OutputFormat.OTEL_METRICS.toString())); + assertThat(histogramAggregateActionConfig.getMetricName(), equalTo(HistogramAggregateActionConfig.HISTOGRAM_METRIC_NAME)); } @Test @@ -106,6 +109,9 @@ void testValidConfig() throws NoSuchFieldException, IllegalAccessException { longBuckets.add(longValue2); setField(HistogramAggregateActionConfig.class, histogramAggregateActionConfig, "buckets", longBuckets); assertThat(histogramAggregateActionConfig.getBuckets(), containsInAnyOrder(longBuckets.toArray())); + final String testName = UUID.randomUUID().toString(); + setField(HistogramAggregateActionConfig.class, histogramAggregateActionConfig, "metricName", testName); + assertThat(histogramAggregateActionConfig.getMetricName(), equalTo(testName)); } @Test diff --git a/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateActionTests.java b/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateActionTests.java index b2b498306b..155acee918 100644 --- a/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateActionTests.java +++ b/data-prepper-plugins/aggregate-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/aggregate/actions/HistogramAggregateActionTests.java @@ -23,13 +23,18 @@ import org.apache.commons.lang3.RandomStringUtils; import java.util.Arrays; +import java.util.ArrayList; import java.util.Collections; import java.util.HashMap; import java.util.Map; import java.util.List; -import java.util.ArrayList; +import java.util.Random; +import java.util.UUID; +import java.time.Instant; import java.util.concurrent.ThreadLocalRandom; +import static org.mockito.Mockito.when; +import static org.mockito.Mockito.mock; import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.not; import static org.hamcrest.MatcherAssert.assertThat; @@ -193,7 +198,7 @@ void testHistogramAggregateOTelFormat(int testCount) throws NoSuchFieldException final String expectedStartTimeKey = histogramAggregateActionConfig.getStartTimeKey(); Map expectedEventMap = new HashMap<>(Collections.singletonMap("count", (long)testCount)); expectedEventMap.put("unit", testUnits); - expectedEventMap.put("name", HistogramAggregateAction.HISTOGRAM_METRIC_NAME); + expectedEventMap.put("name", HistogramAggregateActionConfig.HISTOGRAM_METRIC_NAME); expectedEventMap.put("sum", expectedSum); expectedEventMap.put("min", expectedMin); expectedEventMap.put("max", expectedMax); @@ -207,7 +212,7 @@ void testHistogramAggregateOTelFormat(int testCount) throws NoSuchFieldException for (int i = 0; i < expectedBucketCounts.length; i++) { assertThat(expectedBucketCounts[i], equalTo(bucketCountsFromResult.get(i))); } - assertThat(((Map)result.get(0).toMap().get("attributes")), hasEntry(HistogramAggregateAction.HISTOGRAM_METRIC_NAME+"_key", testKey)); + assertThat(((Map)result.get(0).toMap().get("attributes")), hasEntry(HistogramAggregateActionConfig.HISTOGRAM_METRIC_NAME+"_key", testKey)); List exemplars = (List )result.get(0).toMap().get("exemplars"); assertThat(exemplars.size(), equalTo(2)); assertThat(((Map)result.get(0).toMap().get("attributes")), hasEntry(dataKey, dataValue)); @@ -235,4 +240,134 @@ void testHistogramAggregateOTelFormat(int testCount) throws NoSuchFieldException } } } + + @ParameterizedTest + @ValueSource(ints = {10, 20, 50, 100}) + void testHistogramAggregateOTelFormatWithStartAndEndTimesInTheEvent(int testCount) throws NoSuchFieldException, IllegalAccessException { + HistogramAggregateActionConfig mockConfig = mock(HistogramAggregateActionConfig.class); + String startTimeKey = UUID.randomUUID().toString(); + String endTimeKey =
UUID.randomUUID().toString(); + final String testKeyPrefix = RandomStringUtils.randomAlphabetic(5)+"_"; + when(mockConfig.getStartTimeKey()).thenReturn(startTimeKey); + when(mockConfig.getEndTimeKey()).thenReturn(endTimeKey); + final String testName = UUID.randomUUID().toString(); + when(mockConfig.getMetricName()).thenReturn(testName); + when(mockConfig.getOutputFormat()).thenReturn(OutputFormat.OTEL_METRICS.toString()); + String keyPrefix = UUID.randomUUID().toString(); + final String testUnits = "ms"; + when(mockConfig.getUnits()).thenReturn(testUnits); + when(mockConfig.getRecordMinMax()).thenReturn(true); + final double TEST_VALUE_RANGE_MIN = 0.0; + final double TEST_VALUE_RANGE_MAX = 6.0; + final double TEST_VALUE_RANGE_STEP = 2.0; + final double bucket1 = TEST_VALUE_RANGE_MIN; + final double bucket2 = bucket1 + TEST_VALUE_RANGE_STEP; + final double bucket3 = bucket2 + TEST_VALUE_RANGE_STEP; + List buckets = new ArrayList(); + buckets.add(bucket1); + buckets.add(bucket2); + buckets.add(bucket3); + when(mockConfig.getBuckets()).thenReturn(buckets); + final String testKey = RandomStringUtils.randomAlphabetic(10); + when(mockConfig.getKey()).thenReturn(testKey); + final String testPrefix = RandomStringUtils.randomAlphabetic(7); + when(mockConfig.getSumKey()).thenReturn(testPrefix+"sum"); + when(mockConfig.getMinKey()).thenReturn(testPrefix+"min"); + when(mockConfig.getMaxKey()).thenReturn(testPrefix+"max"); + when(mockConfig.getCountKey()).thenReturn(testPrefix+"count"); + when(mockConfig.getBucketsKey()).thenReturn(testPrefix+"buckets"); + when(mockConfig.getBucketCountsKey()).thenReturn(testPrefix+"bucketcounts"); + when(mockConfig.getDurationKey()).thenReturn(testPrefix+"duration"); + histogramAggregateAction = createObjectUnderTest(mockConfig); + final String dataKey = RandomStringUtils.randomAlphabetic(10); + final String dataValue = RandomStringUtils.randomAlphabetic(15); + final AggregateActionInput aggregateActionInput = new AggregateActionTestUtils.TestAggregateActionInput(Map.of(dataKey, dataValue)); + Long[] expectedBucketCounts = new Long[buckets.size()+1]; + double expectedSum = 0.0; + double expectedMin = TEST_VALUE_RANGE_MAX+TEST_VALUE_RANGE_STEP+1.0; + double expectedMax = TEST_VALUE_RANGE_MIN-TEST_VALUE_RANGE_STEP-1.0; + Arrays.fill(expectedBucketCounts, (long)0); + Random random = new Random(); + final Instant testTime = Instant.ofEpochSecond(Instant.now().getEpochSecond()); + for (int i = 0; i < testCount; i++) { + final double value = ThreadLocalRandom.current().nextDouble(TEST_VALUE_RANGE_MIN-TEST_VALUE_RANGE_STEP, TEST_VALUE_RANGE_MAX+TEST_VALUE_RANGE_STEP); + if (value < bucket1) { + expectedBucketCounts[0]++; + } else if (value < bucket2) { + expectedBucketCounts[1]++; + } else if (value < bucket3) { + expectedBucketCounts[2]++; + } else { + expectedBucketCounts[3]++; + } + expectedSum += value; + if (value < expectedMin) { + expectedMin = value; + } + if (value > expectedMax) { + expectedMax = value; + } + Instant sTime = (i == 0) ? testTime : testTime.plusSeconds(random.nextInt(5)); + Instant eTime = (i == testCount-1) ? 
testTime.plusSeconds(100) : testTime.plusSeconds(50+random.nextInt(45)); + Map eventMap = Collections.synchronizedMap(Map.of(testKey, value, startTimeKey, sTime, endTimeKey, eTime)); + Event testEvent = JacksonEvent.builder() + .withEventType("event") + .withData(eventMap) + .build(); + final AggregateActionResponse aggregateActionResponse = histogramAggregateAction.handleEvent(testEvent, aggregateActionInput); + assertThat(aggregateActionResponse.getEvent(), equalTo(null)); + } + + final AggregateActionOutput actionOutput = histogramAggregateAction.concludeGroup(aggregateActionInput); + final List result = actionOutput.getEvents(); + assertThat(result.size(), equalTo(1)); + final String expectedCountKey = mockConfig.getCountKey(); + final String expectedStartTimeKey = mockConfig.getStartTimeKey(); + Map expectedEventMap = new HashMap<>(Collections.singletonMap("count", (long)testCount)); + expectedEventMap.put("unit", testUnits); + expectedEventMap.put("name", testName); + expectedEventMap.put("sum", expectedSum); + expectedEventMap.put("min", expectedMin); + expectedEventMap.put("max", expectedMax); + expectedEventMap.put("bucketCounts", expectedBucketCounts.length); + expectedEventMap.put("explicitBoundsCount", expectedBucketCounts.length-1); + + expectedEventMap.forEach((k, v) -> assertThat(result.get(0).toMap(), hasEntry(k, v))); + assertThat(result.get(0).toMap(), hasKey("startTime")); + assertThat(result.get(0).toMap(), hasKey("time")); + final List<Long> bucketCountsFromResult = (ArrayList<Long>)result.get(0).toMap().get("bucketCountsList"); + for (int i = 0; i < expectedBucketCounts.length; i++) { + assertThat(expectedBucketCounts[i], equalTo(bucketCountsFromResult.get(i))); + } + assertThat(((Map)result.get(0).toMap().get("attributes")), hasEntry(testName+"_key", testKey)); + List exemplars = (List)result.get(0).toMap().get("exemplars"); + assertThat(exemplars.size(), equalTo(2)); + assertThat(((Map)result.get(0).toMap().get("attributes")), hasEntry(dataKey, dataValue)); + final String expectedDurationKey = mockConfig.getDurationKey(); + assertThat(((Map)result.get(0).toMap().get("attributes")), hasKey(expectedDurationKey)); + JacksonMetric metric = (JacksonMetric) result.get(0); + assertThat(metric.toJsonString().indexOf("attributes"), not(-1)); + final List<Double> explicitBoundsFromResult = (ArrayList<Double>)result.get(0).toMap().get("explicitBounds"); + double bucketVal = TEST_VALUE_RANGE_MIN; + for (int i = 0; i < explicitBoundsFromResult.size(); i++) { + assertThat(explicitBoundsFromResult.get(i), equalTo(bucketVal)); + bucketVal += TEST_VALUE_RANGE_STEP; + } + final List<Map<String, Object>> bucketsFromResult = (ArrayList<Map<String, Object>>)result.get(0).toMap().get("buckets"); + double expectedBucketMin = -Float.MAX_VALUE; + double expectedBucketMax = TEST_VALUE_RANGE_MIN; + for (int i = 0; i < bucketsFromResult.size(); i++) { + assertThat(bucketsFromResult.get(i), hasEntry("min", expectedBucketMin)); + assertThat(bucketsFromResult.get(i), hasEntry("max", expectedBucketMax)); + assertThat(bucketsFromResult.get(i), hasEntry("count", expectedBucketCounts[i])); + expectedBucketMin = expectedBucketMax; + expectedBucketMax += TEST_VALUE_RANGE_STEP; + if (i == bucketsFromResult.size()-2) { + expectedBucketMax = Float.MAX_VALUE; + } + } + + assertThat(result.get(0).get("startTime", String.class), equalTo(testTime.toString())); + assertThat(result.get(0).get("time", String.class), equalTo(testTime.plusSeconds(100).toString())); + } }
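Both aggregate actions now derive the group's time window from the events themselves, keeping the earliest start and the latest end seen, which is what the `startTime`/`time` assertions in the tests above pin down. A small sketch of that merge rule, with hypothetical offsets:

```java
import java.time.Instant;
import java.util.List;

public class WindowMergeDemo {
    public static void main(String[] args) {
        final Instant base = Instant.parse("2024-06-10T00:00:00Z");
        // Per-event (start, end) offsets in seconds, in arrival order.
        final List<long[]> eventWindows = List.of(
                new long[]{30, 40}, new long[]{10, 50}, new long[]{20, 90});

        Instant groupStart = null;
        Instant groupEnd = null;
        for (final long[] w : eventWindows) {
            final Instant eventStart = base.plusSeconds(w[0]);
            final Instant eventEnd = base.plusSeconds(w[1]);
            // Same rule the count and histogram actions apply per event:
            // keep the earliest start and the latest end seen for the group.
            if (groupStart == null || eventStart.isBefore(groupStart)) {
                groupStart = eventStart;
            }
            if (groupEnd == null || eventEnd.isAfter(groupEnd)) {
                groupEnd = eventEnd;
            }
        }
        System.out.println(groupStart + " .. " + groupEnd);
        // -> 2024-06-10T00:00:10Z .. 2024-06-10T00:01:30Z
    }
}
```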
diff --git a/data-prepper-plugins/armeria-common/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/armeria-common/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index ca6ee9cea8..0000000000 --- a/data-prepper-plugins/armeria-common/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1 +0,0 @@ -mock-maker-inline \ No newline at end of file diff --git a/data-prepper-plugins/avro-codecs/build.gradle b/data-prepper-plugins/avro-codecs/build.gradle index e6c5ea5e54..2bce28bbe0 100644 --- a/data-prepper-plugins/avro-codecs/build.gradle +++ b/data-prepper-plugins/avro-codecs/build.gradle @@ -6,7 +6,7 @@ dependencies { implementation project(path: ':data-prepper-api') implementation libs.avro.core - implementation 'org.apache.parquet:parquet-common:1.14.0' + implementation libs.parquet.common implementation 'software.amazon.awssdk:s3' implementation 'software.amazon.awssdk:apache-client' testImplementation 'org.json:json:20240205' diff --git a/data-prepper-plugins/avro-codecs/src/test/java/org/opensearch/dataprepper/avro/AvroAutoSchemaGeneratorTest.java b/data-prepper-plugins/avro-codecs/src/test/java/org/opensearch/dataprepper/avro/AvroAutoSchemaGeneratorTest.java index 622eb56a1b..1b66b62c37 100644 --- a/data-prepper-plugins/avro-codecs/src/test/java/org/opensearch/dataprepper/avro/AvroAutoSchemaGeneratorTest.java +++ b/data-prepper-plugins/avro-codecs/src/test/java/org/opensearch/dataprepper/avro/AvroAutoSchemaGeneratorTest.java @@ -17,7 +17,7 @@ import java.util.Collections; import java.util.List; import java.util.Map; -import java.util.Random; +import java.util.Timer; import java.util.UUID; import java.util.stream.Stream; @@ -218,7 +218,7 @@ static class SomeUnknownTypesArgumentsProvider implements ArgumentsProvider { @Override public Stream provideArguments(ExtensionContext context) { return Stream.of( - arguments(Random.class), + arguments(Timer.class), arguments(InputStream.class), arguments(File.class) ); diff --git a/data-prepper-plugins/aws-plugin/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/aws-plugin/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index 23c33feb6d..0000000000 --- a/data-prepper-plugins/aws-plugin/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1,3 +0,0 @@ -# To enable mocking of final classes with vanilla Mockito -# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods -mock-maker-inline diff --git a/data-prepper-plugins/blocking-buffer/src/test/java/org/opensearch/dataprepper/plugins/buffer/blockingbuffer/BlockingBufferTests.java b/data-prepper-plugins/blocking-buffer/src/test/java/org/opensearch/dataprepper/plugins/buffer/blockingbuffer/BlockingBufferTests.java index 194c810ec4..f3f28db174 100644 --- a/data-prepper-plugins/blocking-buffer/src/test/java/org/opensearch/dataprepper/plugins/buffer/blockingbuffer/BlockingBufferTests.java +++ b/data-prepper-plugins/blocking-buffer/src/test/java/org/opensearch/dataprepper/plugins/buffer/blockingbuffer/BlockingBufferTests.java @@ -328,7 +328,7 @@ public Stream provideArguments(final ExtensionContext conte return Stream.of( Arguments.of(0, randomInt + 1, 0.0), Arguments.of(1, 100, 1.0), - Arguments.of(randomInt, randomInt, 100.0), + Arguments.of(randomInt + 1, randomInt + 1, 100.0), Arguments.of(randomInt, randomInt +
250, ((double) randomInt / (randomInt + 250)) * 100), Arguments.of(6, 9, 66.66666666666666), Arguments.of(531, 1000, 53.1), diff --git a/data-prepper-plugins/cloudwatch-logs/build.gradle b/data-prepper-plugins/cloudwatch-logs/build.gradle index dc374997f0..3bbb24f443 100644 --- a/data-prepper-plugins/cloudwatch-logs/build.gradle +++ b/data-prepper-plugins/cloudwatch-logs/build.gradle @@ -16,7 +16,6 @@ dependencies { implementation 'org.projectlombok:lombok:1.18.26' implementation 'org.hibernate.validator:hibernate-validator:8.0.0.Final' testImplementation project(path: ':data-prepper-test-common') - testImplementation testLibs.mockito.inline compileOnly 'org.projectlombok:lombok:1.18.24' annotationProcessor 'org.projectlombok:lombok:1.18.24' } diff --git a/data-prepper-plugins/cloudwatch-metrics-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/cloudwatch-metrics-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index 23c33feb6d..0000000000 --- a/data-prepper-plugins/cloudwatch-metrics-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1,3 +0,0 @@ -# To enable mocking of final classes with vanilla Mockito -# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods -mock-maker-inline diff --git a/data-prepper-plugins/common/build.gradle b/data-prepper-plugins/common/build.gradle index 947d1234d4..cdfdeab9ef 100644 --- a/data-prepper-plugins/common/build.gradle +++ b/data-prepper-plugins/common/build.gradle @@ -19,12 +19,11 @@ dependencies { implementation libs.bouncycastle.bcpkix implementation libs.reflections.core implementation 'io.micrometer:micrometer-core' - implementation 'org.apache.parquet:parquet-common:1.14.0' + implementation libs.parquet.common implementation 'org.xerial.snappy:snappy-java:1.1.10.5' testImplementation project(':data-prepper-plugins:blocking-buffer') testImplementation project(':data-prepper-test-event') testImplementation libs.commons.io - testImplementation testLibs.mockito.inline } jacocoTestCoverageVerification { diff --git a/data-prepper-plugins/common/src/main/java/org/opensearch/dataprepper/plugins/processor/StringProcessor.java b/data-prepper-plugins/common/src/main/java/org/opensearch/dataprepper/plugins/processor/StringProcessor.java index aa2930e634..3cf2953e06 100644 --- a/data-prepper-plugins/common/src/main/java/org/opensearch/dataprepper/plugins/processor/StringProcessor.java +++ b/data-prepper-plugins/common/src/main/java/org/opensearch/dataprepper/plugins/processor/StringProcessor.java @@ -5,6 +5,7 @@ package org.opensearch.dataprepper.plugins.processor; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; import org.opensearch.dataprepper.model.annotations.DataPrepperPlugin; import org.opensearch.dataprepper.model.annotations.DataPrepperPluginConstructor; import org.opensearch.dataprepper.model.configuration.PluginSetting; @@ -40,6 +41,7 @@ public class StringProcessor implements Processor, Record> private final boolean upperCase; public static class Configuration { + @JsonPropertyDescription("Whether to convert to uppercase (`true`) or lowercase (`false`).") private boolean upperCase = true; public boolean getUpperCase() { diff --git a/data-prepper-plugins/csv-processor/build.gradle b/data-prepper-plugins/csv-processor/build.gradle index 56c02daf83..cda0694a66 100644 --- a/data-prepper-plugins/csv-processor/build.gradle +++ 
b/data-prepper-plugins/csv-processor/build.gradle @@ -12,7 +12,7 @@ dependencies { implementation project(':data-prepper-api') implementation 'com.fasterxml.jackson.dataformat:jackson-dataformat-csv' implementation 'io.micrometer:micrometer-core' - implementation 'org.apache.parquet:parquet-common:1.14.0' + implementation libs.parquet.common implementation 'software.amazon.awssdk:s3' implementation 'software.amazon.awssdk:apache-client' testImplementation project(':data-prepper-plugins:log-generator-source') diff --git a/data-prepper-plugins/csv-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/csv/CsvProcessorConfig.java b/data-prepper-plugins/csv-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/csv/CsvProcessorConfig.java index ec5d685b7e..8c770b597a 100644 --- a/data-prepper-plugins/csv-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/csv/CsvProcessorConfig.java +++ b/data-prepper-plugins/csv-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/csv/CsvProcessorConfig.java @@ -6,6 +6,7 @@ package org.opensearch.dataprepper.plugins.processor.csv; import com.fasterxml.jackson.annotation.JsonProperty; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; import jakarta.validation.constraints.AssertTrue; import java.util.List; @@ -20,24 +21,45 @@ public class CsvProcessorConfig { static final Boolean DEFAULT_DELETE_HEADERS = true; @JsonProperty("source") + @JsonPropertyDescription("The field in the event that will be parsed. Default value is `message`.") private String source = DEFAULT_SOURCE; @JsonProperty("delimiter") + @JsonPropertyDescription("The character separating each column. Default value is `,`.") private String delimiter = DEFAULT_DELIMITER; @JsonProperty("delete_header") + @JsonPropertyDescription("If specified, the event header (`column_names_source_key`) is deleted after the event " + + "is parsed. If there is no event header, no action is taken. Default value is `true`.") private Boolean deleteHeader = DEFAULT_DELETE_HEADERS; @JsonProperty("quote_character") + @JsonPropertyDescription("The character used as a text qualifier for a single column of data. " + + "Default value is `\"`.") private String quoteCharacter = DEFAULT_QUOTE_CHARACTER; @JsonProperty("column_names_source_key") + @JsonPropertyDescription("The field in the event that specifies the CSV column names, which will be " + + "automatically detected. If extra column names are needed, they are automatically " + + "generated according to their index. If `column_names` is also defined, the header in " + + "`column_names_source_key` can also be used to generate the event fields. " + + "If too few columns are specified in this field, the remaining column names are automatically generated. " + + "If too many column names are specified in this field, the CSV processor omits the extra column names.") private String columnNamesSourceKey; @JsonProperty("column_names") + @JsonPropertyDescription("User-specified names for the CSV columns. " + + "Default value is `[column1, column2, ..., columnN]` if there are no columns of data in the CSV " + + "record and `column_names_source_key` is not defined. If `column_names_source_key` is defined, " + + "the header in `column_names_source_key` generates the event fields. If too few columns are specified " + + "in this field, the remaining column names are automatically generated. 
" + + "If too many column names are specified in this field, the CSV processor omits the extra column names.") private List<String> columnNames; @JsonProperty("csv_when") + @JsonPropertyDescription("Allows you to specify a [conditional expression](https://opensearch.org/docs/latest/data-prepper/pipelines/expression-syntax/), " + + "such as `/some-key == \"test\"`, that will be evaluated to determine whether " + + "the processor should be applied to the event.") private String csvWhen; /** diff --git a/data-prepper-plugins/date-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/date/DateProcessorConfig.java b/data-prepper-plugins/date-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/date/DateProcessorConfig.java index a74b2e9d38..aed3a38674 100644 --- a/data-prepper-plugins/date-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/date/DateProcessorConfig.java +++ b/data-prepper-plugins/date-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/date/DateProcessorConfig.java @@ -7,6 +7,7 @@ import com.fasterxml.jackson.annotation.JsonIgnore; import com.fasterxml.jackson.annotation.JsonProperty; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; import jakarta.validation.constraints.AssertTrue; import java.time.ZoneId; @@ -24,8 +25,16 @@ public class DateProcessorConfig { public static class DateMatch { @JsonProperty("key") + @JsonPropertyDescription("Represents the event key against which to match patterns. " + + "Required if `match` is configured.") private String key; @JsonProperty("patterns") + @JsonPropertyDescription("A list of possible patterns that the timestamp value of the key can have. The patterns " + + "are based on a sequence of letters and symbols. The `patterns` support all the patterns listed in the " + + "Java [DateTimeFormatter](https://docs.oracle.com/javase/8/docs/api/java/time/format/DateTimeFormatter.html) reference. " + + "The timestamp value also supports `epoch_second`, `epoch_milli`, and `epoch_nano` values, " + + "which represent the timestamp as the number of seconds, milliseconds, and nanoseconds since the epoch. " + + "Epoch values always use the UTC time zone.") private List<String> patterns; public DateMatch() { @@ -82,30 +91,57 @@ public static boolean isValidPattern(final String pattern) { } @JsonProperty("from_time_received") + @JsonPropertyDescription("When `true`, the timestamp from the event metadata, " + + "which is the time at which the source receives the event, is added to the event data. " + + "This option cannot be defined at the same time as `match`. Default is `false`.") private Boolean fromTimeReceived = DEFAULT_FROM_TIME_RECEIVED; @JsonProperty("to_origination_metadata") + @JsonPropertyDescription("When `true`, the matched time is also added to the event's metadata as an instance of " + + "`Instant`. Default is `false`.") private Boolean toOriginationMetadata = DEFAULT_TO_ORIGINATION_METADATA; @JsonProperty("match") + @JsonPropertyDescription("The date match configuration. " + + "This option cannot be defined at the same time as `from_time_received`. There is no default value.") private List<DateMatch> match; @JsonProperty("destination") + @JsonPropertyDescription("The field used to store the timestamp parsed by the date processor. " + + "Can be used with both `match` and `from_time_received`. Default is `@timestamp`.") private String destination = DEFAULT_DESTINATION; @JsonProperty("output_format") + @JsonPropertyDescription("Determines the format of the timestamp added to an event. 
" + + "Default is `yyyy-MM-dd'T'HH:mm:ss.SSSXXX`.") private String outputFormat = DEFAULT_OUTPUT_FORMAT; @JsonProperty("source_timezone") + @JsonPropertyDescription("The time zone used to parse dates, including when the zone or offset cannot be extracted " + + "from the value. If the zone or offset is part of the value, then the time zone is ignored. " + + "A list of all the available time zones is contained in the **TZ database name** column of " + + "[the list of database time zones](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones#List).") private String sourceTimezone = DEFAULT_SOURCE_TIMEZONE; @JsonProperty("destination_timezone") + @JsonPropertyDescription("The time zone used for storing the timestamp in the `destination` field. " + + "A list of all the available time zones is contained in the **TZ database name** column of " + + "[the list of database time zones](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones#List).") private String destinationTimezone = DEFAULT_DESTINATION_TIMEZONE; @JsonProperty("locale") + @JsonPropertyDescription("The locale used for parsing dates. Commonly used for parsing month names (`MMM`). " + + "The value can contain language, country, or variant fields in IETF BCP 47, such as `en-US`, " + + "or a string representation of the " + + "[locale](https://docs.oracle.com/javase/8/docs/api/java/util/Locale.html) object, such as `en_US`. " + + "A full list of locale fields, including language, country, and variant, can be found in " + + "[the language subtag registry](https://www.iana.org/assignments/language-subtag-registry/language-subtag-registry). " + + "Default is `Locale.ROOT`.") private String locale; @JsonProperty("date_when") + @JsonPropertyDescription("Specifies under what condition the `date` processor should perform matching. 
" + + "Default is no condition.") private String dateWhen; @JsonIgnore diff --git a/data-prepper-plugins/decompress-processor/build.gradle b/data-prepper-plugins/decompress-processor/build.gradle index 9d67cffc3b..1068830a59 100644 --- a/data-prepper-plugins/decompress-processor/build.gradle +++ b/data-prepper-plugins/decompress-processor/build.gradle @@ -9,5 +9,4 @@ dependencies { implementation project(':data-prepper-plugins:common') implementation 'com.fasterxml.jackson.core:jackson-databind' implementation 'io.micrometer:micrometer-core' - testImplementation testLibs.mockito.inline } \ No newline at end of file diff --git a/data-prepper-plugins/dynamodb-source-coordination-store/build.gradle b/data-prepper-plugins/dynamodb-source-coordination-store/build.gradle index 4b9fb2a8f4..1912c2ae9b 100644 --- a/data-prepper-plugins/dynamodb-source-coordination-store/build.gradle +++ b/data-prepper-plugins/dynamodb-source-coordination-store/build.gradle @@ -10,7 +10,6 @@ dependencies { implementation 'software.amazon.awssdk:dynamodb' implementation 'software.amazon.awssdk:dynamodb-enhanced' implementation 'software.amazon.awssdk:sts' - testImplementation testLibs.mockito.inline } test { diff --git a/data-prepper-plugins/dynamodb-source/build.gradle b/data-prepper-plugins/dynamodb-source/build.gradle index 8fdc037470..3b3046434a 100644 --- a/data-prepper-plugins/dynamodb-source/build.gradle +++ b/data-prepper-plugins/dynamodb-source/build.gradle @@ -25,6 +25,5 @@ dependencies { implementation project(path: ':data-prepper-plugins:buffer-common') - testImplementation testLibs.mockito.inline testImplementation 'com.fasterxml.jackson.dataformat:jackson-dataformat-yaml' } \ No newline at end of file diff --git a/data-prepper-plugins/event-json-codecs/build.gradle b/data-prepper-plugins/event-json-codecs/build.gradle index aad563d19d..2278bf6033 100644 --- a/data-prepper-plugins/event-json-codecs/build.gradle +++ b/data-prepper-plugins/event-json-codecs/build.gradle @@ -15,7 +15,7 @@ dependencies { implementation 'com.fasterxml.jackson.dataformat:jackson-dataformat-xml' implementation 'com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.0' testImplementation 'com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.0' - implementation 'org.apache.parquet:parquet-common:1.14.0' + implementation libs.parquet.common testImplementation project(':data-prepper-test-common') } diff --git a/data-prepper-plugins/event-json-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/event_json/EventJsonInputCodecTest.java b/data-prepper-plugins/event-json-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/event_json/EventJsonInputCodecTest.java index f85d1c6605..a4b0377963 100644 --- a/data-prepper-plugins/event-json-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/event_json/EventJsonInputCodecTest.java +++ b/data-prepper-plugins/event-json-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/event_json/EventJsonInputCodecTest.java @@ -11,9 +11,12 @@ import org.junit.jupiter.api.Test; import org.junit.jupiter.params.ParameterizedTest; import org.junit.jupiter.params.provider.ValueSource; + import static org.mockito.Mockito.when; import static org.mockito.Mockito.mock; + import org.mockito.Mock; + import static org.hamcrest.MatcherAssert.assertThat; import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.CoreMatchers.not; @@ -28,6 +31,7 @@ import java.io.ByteArrayInputStream; import java.time.Instant; +import java.time.temporal.ChronoUnit; import 
java.util.List; import java.util.LinkedList; import java.util.Map; @@ -56,7 +60,7 @@ public EventJsonInputCodec createInputCodec() { @ParameterizedTest @ValueSource(strings = {"", "{}"}) public void emptyTest(String input) throws Exception { - input = "{\""+EventJsonDefines.VERSION+"\":\""+DataPrepperVersion.getCurrentVersion().toString()+"\", \""+EventJsonDefines.EVENTS+"\":["+input+"]}"; + input = "{\"" + EventJsonDefines.VERSION + "\":\"" + DataPrepperVersion.getCurrentVersion().toString() + "\", \"" + EventJsonDefines.EVENTS + "\":[" + input + "]}"; ByteArrayInputStream inputStream = new ByteArrayInputStream(input.getBytes()); inputCodec = createInputCodec(); Consumer<Record<Event>> consumer = mock(Consumer.class); @@ -70,15 +74,15 @@ public void inCompatibleVersionTest() throws Exception { final String key = UUID.randomUUID().toString(); final String value = UUID.randomUUID().toString(); Map<String, Object> data = Map.of(key, value); - Instant startTime = Instant.now(); + Instant startTime = Instant.now().truncatedTo(ChronoUnit.MICROS); Event event = createEvent(data, startTime); Map<String, Object> dataMap = event.toMap(); Map<String, Object> metadataMap = objectMapper.convertValue(event.getMetadata(), Map.class); - String input = "{\""+EventJsonDefines.VERSION+"\":\"3.0\", \""+EventJsonDefines.EVENTS+"\":["; + String input = "{\"" + EventJsonDefines.VERSION + "\":\"3.0\", \"" + EventJsonDefines.EVENTS + "\":["; String comma = ""; for (int i = 0; i < 2; i++) { - input += comma+"{\"data\":"+objectMapper.writeValueAsString(dataMap)+","+"\"metadata\":"+objectMapper.writeValueAsString(metadataMap)+"}"; + input += comma + "{\"data\":" + objectMapper.writeValueAsString(dataMap) + "," + "\"metadata\":" + objectMapper.writeValueAsString(metadataMap) + "}"; comma = ","; } input += "]}"; @@ -95,15 +99,15 @@ public void basicTest() throws Exception { final String key = UUID.randomUUID().toString(); final String value = UUID.randomUUID().toString(); Map<String, Object> data = Map.of(key, value); - Instant startTime = Instant.now(); + Instant startTime = Instant.now().truncatedTo(ChronoUnit.MICROS); Event event = createEvent(data, startTime); Map<String, Object> dataMap = event.toMap(); Map<String, Object> metadataMap = objectMapper.convertValue(event.getMetadata(), Map.class); - String input = "{\""+EventJsonDefines.VERSION+"\":\""+DataPrepperVersion.getCurrentVersion().toString()+"\", \""+EventJsonDefines.EVENTS+"\":["; + String input = "{\"" + EventJsonDefines.VERSION + "\":\"" + DataPrepperVersion.getCurrentVersion().toString() + "\", \"" + EventJsonDefines.EVENTS + "\":["; String comma = ""; for (int i = 0; i < 2; i++) { - input += comma+"{\"data\":"+objectMapper.writeValueAsString(dataMap)+","+"\"metadata\":"+objectMapper.writeValueAsString(metadataMap)+"}"; + input += comma + "{\"data\":" + objectMapper.writeValueAsString(dataMap) + "," + "\"metadata\":" + objectMapper.writeValueAsString(metadataMap) + "}"; comma = ","; } input += "]}"; @@ -111,8 +115,8 @@ List<Record<Event>> records = new LinkedList<>(); inputCodec.parse(inputStream, records::add); assertThat(records.size(), equalTo(2)); - for(Record record : records) { - Event e = (Event)record.getData(); + for (Record record : records) { + Event e = (Event) record.getData(); assertThat(e.get(key, String.class), equalTo(value)); assertThat(e.getMetadata().getTimeReceived(), equalTo(startTime)); assertThat(e.getMetadata().getTags().size(), equalTo(0)); @@ -126,15 +130,15 @@ public void test_with_timeReceivedOverridden() throws Exception { final String key = UUID.randomUUID().toString(); final String value = 
UUID.randomUUID().toString(); Map<String, Object> data = Map.of(key, value); - Instant startTime = Instant.now().minusSeconds(5); + Instant startTime = Instant.now().truncatedTo(ChronoUnit.MICROS).minusSeconds(5); Event event = createEvent(data, startTime); Map<String, Object> dataMap = event.toMap(); Map<String, Object> metadataMap = objectMapper.convertValue(event.getMetadata(), Map.class); - String input = "{\""+EventJsonDefines.VERSION+"\":\""+DataPrepperVersion.getCurrentVersion().toString()+"\", \""+EventJsonDefines.EVENTS+"\":["; + String input = "{\"" + EventJsonDefines.VERSION + "\":\"" + DataPrepperVersion.getCurrentVersion().toString() + "\", \"" + EventJsonDefines.EVENTS + "\":["; String comma = ""; for (int i = 0; i < 2; i++) { - input += comma+"{\"data\":"+objectMapper.writeValueAsString(dataMap)+","+"\"metadata\":"+objectMapper.writeValueAsString(metadataMap)+"}"; + input += comma + "{\"data\":" + objectMapper.writeValueAsString(dataMap) + "," + "\"metadata\":" + objectMapper.writeValueAsString(metadataMap) + "}"; comma = ","; } input += "]}"; @@ -142,8 +146,8 @@ public void test_with_timeReceivedOverridden() throws Exception { List<Record<Event>> records = new LinkedList<>(); inputCodec.parse(inputStream, records::add); assertThat(records.size(), equalTo(2)); - for(Record record : records) { - Event e = (Event)record.getData(); + for (Record record : records) { + Event e = (Event) record.getData(); assertThat(e.get(key, String.class), equalTo(value)); assertThat(e.getMetadata().getTimeReceived(), not(equalTo(startTime))); assertThat(e.getMetadata().getTags().size(), equalTo(0)); @@ -159,7 +163,7 @@ private Event createEvent(final Map json, final Instant timeRece if (timeReceived != null) { logBuilder.withTimeReceived(timeReceived); } - final JacksonEvent event = (JacksonEvent)logBuilder.build(); + final JacksonEvent event = (JacksonEvent) logBuilder.build(); return event; } diff --git a/data-prepper-plugins/event-json-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/event_json/EventJsonInputOutputCodecTest.java b/data-prepper-plugins/event-json-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/event_json/EventJsonInputOutputCodecTest.java index 85e91e5a55..7ea8c49cd0 100644 --- a/data-prepper-plugins/event-json-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/event_json/EventJsonInputOutputCodecTest.java +++ b/data-prepper-plugins/event-json-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/event_json/EventJsonInputOutputCodecTest.java @@ -6,9 +6,12 @@ import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; + import static org.mockito.Mockito.when; import static org.mockito.Mockito.mock; + import org.mockito.Mock; + import static org.hamcrest.MatcherAssert.assertThat; import static org.hamcrest.Matchers.equalTo; @@ -22,6 +25,7 @@ import org.opensearch.dataprepper.model.log.JacksonLog; import java.time.Instant; +import java.time.temporal.ChronoUnit; import java.util.List; import java.util.LinkedList; import java.util.Map; @@ -64,7 +68,7 @@ public void basicTest() throws Exception { final String value = UUID.randomUUID().toString(); Map<String, Object> data = Map.of(key, value); - Instant startTime = Instant.now(); + Instant startTime = Instant.now().truncatedTo(ChronoUnit.MICROS); Event event = createEvent(data, startTime); outputCodec = createOutputCodec(); inputCodec = createInputCodec(); @@ -75,8 +79,8 @@ inputCodec.parse(new ByteArrayInputStream(outputStream.toByteArray()), records::add); assertThat(records.size(), equalTo(1)); - 
for(Record record : records) { - Event e = (Event)record.getData(); + for (Record record : records) { + Event e = (Event) record.getData(); assertThat(e.get(key, String.class), equalTo(value)); assertThat(e.getMetadata().getTimeReceived(), equalTo(startTime)); assertThat(e.getMetadata().getTags().size(), equalTo(0)); @@ -90,7 +94,7 @@ public void multipleEventsTest() throws Exception { final String value = UUID.randomUUID().toString(); Map<String, Object> data = Map.of(key, value); - Instant startTime = Instant.now(); + Instant startTime = Instant.now().truncatedTo(ChronoUnit.MICROS); Event event = createEvent(data, startTime); outputCodec = createOutputCodec(); inputCodec = createInputCodec(); @@ -103,8 +107,8 @@ inputCodec.parse(new ByteArrayInputStream(outputStream.toByteArray()), records::add); assertThat(records.size(), equalTo(3)); - for(Record record : records) { - Event e = (Event)record.getData(); + for (Record record : records) { + Event e = (Event) record.getData(); assertThat(e.get(key, String.class), equalTo(value)); assertThat(e.getMetadata().getTimeReceived(), equalTo(startTime)); assertThat(e.getMetadata().getTags().size(), equalTo(0)); @@ -122,7 +126,7 @@ public void extendedTest() throws Exception { Set<String> tags = Set.of(UUID.randomUUID().toString(), UUID.randomUUID().toString()); List<String> tagsList = tags.stream().collect(Collectors.toList()); - Instant startTime = Instant.now(); + Instant startTime = Instant.now().truncatedTo(ChronoUnit.MICROS); Event event = createEvent(data, startTime); Instant origTime = startTime.minusSeconds(5); event.getMetadata().setExternalOriginationTime(origTime); @@ -135,11 +139,11 @@ outputCodec.complete(outputStream); assertThat(outputCodec.getExtension(), equalTo(EventJsonOutputCodec.EVENT_JSON)); List<Record<Event>> records = new LinkedList<>(); -inputCodec.parse(new ByteArrayInputStream(outputStream.toByteArray()), records::add); + inputCodec.parse(new ByteArrayInputStream(outputStream.toByteArray()), records::add); assertThat(records.size(), equalTo(1)); - for(Record record : records) { - Event e = (Event)record.getData(); + for (Record record : records) { + Event e = (Event) record.getData(); assertThat(e.get(key, String.class), equalTo(value)); assertThat(e.getMetadata().getTimeReceived(), equalTo(startTime)); assertThat(e.getMetadata().getTags(), equalTo(tags)); @@ -157,7 +161,7 @@ private Event createEvent(final Map json, final Instant timeRece if (timeReceived != null) { logBuilder.withTimeReceived(timeReceived); } - final JacksonEvent event = (JacksonEvent)logBuilder.build(); + final JacksonEvent event = (JacksonEvent) logBuilder.build(); return event; } diff --git a/data-prepper-plugins/event-json-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/event_json/EventJsonOutputCodecTest.java b/data-prepper-plugins/event-json-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/event_json/EventJsonOutputCodecTest.java index 51dda545cb..b32d2b62e9 100644 --- a/data-prepper-plugins/event-json-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/event_json/EventJsonOutputCodecTest.java +++ b/data-prepper-plugins/event-json-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/event_json/EventJsonOutputCodecTest.java @@ -11,6 +11,7 @@ import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; import org.mockito.Mock; + import static org.hamcrest.MatcherAssert.assertThat; import static org.hamcrest.Matchers.equalTo; @@ -22,6 
+23,7 @@ import org.opensearch.dataprepper.model.log.JacksonLog; import java.time.Instant; +import java.time.temporal.ChronoUnit; import java.util.Map; import java.util.UUID; @@ -49,7 +51,7 @@ public void basicTest() throws Exception { final String value = UUID.randomUUID().toString(); Map<String, Object> data = Map.of(key, value); - Instant startTime = Instant.now(); + Instant startTime = Instant.now().truncatedTo(ChronoUnit.MICROS); Event event = createEvent(data, startTime); outputCodec = createOutputCodec(); outputCodec.start(outputStream, null, null); @@ -59,10 +61,10 @@ public void basicTest() throws Exception { Map<String, Object> dataMap = event.toMap(); Map<String, Object> metadataMap = objectMapper.convertValue(event.getMetadata(), Map.class); //String expectedOutput = "{\"version\":\""+DataPrepperVersion.getCurrentVersion().toString()+"\",\""+EventJsonDefines.EVENTS+"\":["; - String expectedOutput = "{\""+EventJsonDefines.VERSION+"\":\""+DataPrepperVersion.getCurrentVersion().toString()+"\",\""+EventJsonDefines.EVENTS+"\":["; + String expectedOutput = "{\"" + EventJsonDefines.VERSION + "\":\"" + DataPrepperVersion.getCurrentVersion().toString() + "\",\"" + EventJsonDefines.EVENTS + "\":["; String comma = ""; for (int i = 0; i < 2; i++) { - expectedOutput += comma+"{\""+EventJsonDefines.DATA+"\":"+objectMapper.writeValueAsString(dataMap)+","+"\""+EventJsonDefines.METADATA+"\":"+objectMapper.writeValueAsString(metadataMap)+"}"; + expectedOutput += comma + "{\"" + EventJsonDefines.DATA + "\":" + objectMapper.writeValueAsString(dataMap) + "," + "\"" + EventJsonDefines.METADATA + "\":" + objectMapper.writeValueAsString(metadataMap) + "}"; comma = ","; } expectedOutput += "]}"; @@ -78,7 +80,7 @@ private Event createEvent(final Map json, final Instant timeRece if (timeReceived != null) { logBuilder.withTimeReceived(timeReceived); } - final JacksonEvent event = (JacksonEvent)logBuilder.build(); + final JacksonEvent event = (JacksonEvent) logBuilder.build(); return event; } diff --git a/data-prepper-plugins/flatten-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessor.java b/data-prepper-plugins/flatten-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessor.java index 2a07fd6d99..9e3218be88 100644 --- a/data-prepper-plugins/flatten-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessor.java +++ b/data-prepper-plugins/flatten-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessor.java @@ -119,14 +119,15 @@ private Map removeListIndicesInKeys(final Map in final Map<String, Object> resultMap = new HashMap<>(); for (final Map.Entry<String, Object> entry : inputMap.entrySet()) { - final String keyWithoutIndices = removeListIndices(entry.getKey()); + final String keyWithoutIndices = removeListIndices(entry.getKey(), config.isRemoveBrackets()); addFieldsToMapWithMerge(keyWithoutIndices, entry.getValue(), resultMap); } return resultMap; } - private String removeListIndices(final String key) { - return key.replaceAll("\\[\\d+\\]", "[]"); + private String removeListIndices(final String key, final boolean removeBrackets) { + final String replacement = removeBrackets ? 
"" : "[]"; + return key.replaceAll("\\[\\d+\\]", replacement); } private void addFieldsToMapWithMerge(String key, Object value, Map<String, Object> map) { diff --git a/data-prepper-plugins/flatten-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessorConfig.java b/data-prepper-plugins/flatten-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessorConfig.java index 96c9d2e024..c1208f5f40 100644 --- a/data-prepper-plugins/flatten-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessorConfig.java +++ b/data-prepper-plugins/flatten-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessorConfig.java @@ -6,6 +6,8 @@ package org.opensearch.dataprepper.plugins.processor.flatten; import com.fasterxml.jackson.annotation.JsonProperty; + +import jakarta.validation.constraints.AssertTrue; import jakarta.validation.constraints.NotNull; import java.util.ArrayList; @@ -29,6 +31,9 @@ public class FlattenProcessorConfig { @JsonProperty("remove_list_indices") private boolean removeListIndices = false; + @JsonProperty("remove_brackets") + private boolean removeBrackets = false; + @JsonProperty("exclude_keys") private List<String> excludeKeys = DEFAULT_EXCLUDE_KEYS; @@ -54,6 +59,10 @@ public boolean isRemoveListIndices() { return removeListIndices; } + public boolean isRemoveBrackets() { + return removeBrackets; + } + public List<String> getExcludeKeys() { return excludeKeys; } @@ -65,4 +74,9 @@ public String getFlattenWhen() { public List<String> getTagsOnFailure() { return tagsOnFailure; } + + @AssertTrue(message = "remove_brackets cannot be true if remove_list_indices is false.") + boolean removeBracketsNotTrueWhenRemoveListIndicesFalse() { + return (!removeBrackets || removeListIndices); + } } diff --git a/data-prepper-plugins/flatten-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessorConfigTest.java b/data-prepper-plugins/flatten-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessorConfigTest.java index d11860df0e..960db201d5 100644 --- a/data-prepper-plugins/flatten-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessorConfigTest.java +++ b/data-prepper-plugins/flatten-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessorConfigTest.java @@ -20,7 +20,7 @@ void testDefaultConfig() { assertThat(FlattenProcessorConfig.getSource(), equalTo(null)); assertThat(FlattenProcessorConfig.getTarget(), equalTo(null)); assertThat(FlattenProcessorConfig.isRemoveListIndices(), equalTo(false)); - assertThat(FlattenProcessorConfig.isRemoveListIndices(), equalTo(false)); + assertThat(FlattenProcessorConfig.isRemoveBrackets(), equalTo(false)); assertThat(FlattenProcessorConfig.getFlattenWhen(), equalTo(null)); assertThat(FlattenProcessorConfig.getTagsOnFailure(), equalTo(null)); assertThat(FlattenProcessorConfig.getExcludeKeys(), equalTo(List.of())); diff --git a/data-prepper-plugins/flatten-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessorTest.java b/data-prepper-plugins/flatten-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessorTest.java index 737d245ff5..df693f7f6f 100644 --- a/data-prepper-plugins/flatten-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessorTest.java +++ 
b/data-prepper-plugins/flatten-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/flatten/FlattenProcessorTest.java @@ -52,6 +52,7 @@ void setUp() { lenient().when(mockConfig.getTarget()).thenReturn(""); lenient().when(mockConfig.isRemoveProcessedFields()).thenReturn(false); lenient().when(mockConfig.isRemoveListIndices()).thenReturn(false); + lenient().when(mockConfig.isRemoveBrackets()).thenReturn(false); lenient().when(mockConfig.getFlattenWhen()).thenReturn(null); lenient().when(mockConfig.getTagsOnFailure()).thenReturn(new ArrayList<>()); lenient().when(mockConfig.getExcludeKeys()).thenReturn(new ArrayList<>()); @@ -119,6 +120,35 @@ void testFlattenEntireEventDataAndRemoveListIndices() { assertThat(resultData.get("list1[].list2[].value"), is(List.of("value1", "value2"))); } + @Test + void testFlattenEntireEventDataAndRemoveListIndicesAndRemoveBrackets() { + when(mockConfig.isRemoveListIndices()).thenReturn(true); + when(mockConfig.isRemoveBrackets()).thenReturn(true); + + final FlattenProcessor processor = createObjectUnderTest(); + final Record<Event> testRecord = createTestRecord(createTestData()); + final List<Record<Event>> resultRecord = (List<Record<Event>>) processor.doExecute(Collections.singletonList(testRecord)); + + assertThat(resultRecord.size(), is(1)); + + final Event resultEvent = resultRecord.get(0).getData(); + Map<String, Object> resultData = resultEvent.get("", Map.class); + + assertThat(resultData.containsKey("key1"), is(true)); + assertThat(resultData.get("key1"), is("val1")); + + assertThat(resultData.containsKey("key2.key3.key.4"), is(true)); + assertThat(resultData.get("key2.key3.key.4"), is("val2")); + + assertThat(resultData.containsKey("list1[].list2[].name"), is(false)); + assertThat(resultData.containsKey("list1.list2.name"), is(true)); + assertThat(resultData.get("list1.list2.name"), is(List.of("name1", "name2"))); + + assertThat(resultData.containsKey("list1[].list2[].value"), is(false)); + assertThat(resultData.containsKey("list1.list2.value"), is(true)); + assertThat(resultData.get("list1.list2.value"), is(List.of("value1", "value2"))); + } + @Test void testFlattenWithSpecificFieldsAsSourceAndTarget() { when(mockConfig.getSource()).thenReturn(SOURCE_KEY); @@ -187,6 +217,37 @@ void testFlattenWithSpecificFieldsAsSourceAndTargetAndRemoveListIndices() { assertThat(resultData.get("list1[].list2[].value"), is(List.of("value1", "value2"))); } + @Test + void testFlattenWithSpecificFieldsAsSourceAndTargetAndRemoveListIndicesAndRemoveBrackets() { + when(mockConfig.getSource()).thenReturn(SOURCE_KEY); + when(mockConfig.getTarget()).thenReturn(TARGET_KEY); + when(mockConfig.isRemoveListIndices()).thenReturn(true); + when(mockConfig.isRemoveBrackets()).thenReturn(true); + + final FlattenProcessor processor = createObjectUnderTest(); + final Record<Event> testRecord = createTestRecord(Map.of(SOURCE_KEY, createTestData())); + final List<Record<Event>> resultRecord = (List<Record<Event>>) processor.doExecute(Collections.singletonList(testRecord)); + + assertThat(resultRecord.size(), is(1)); + + final Event resultEvent = resultRecord.get(0).getData(); + Map<String, Object> resultData = resultEvent.get(TARGET_KEY, Map.class); + + assertThat(resultData.containsKey("key1"), is(true)); + assertThat(resultData.get("key1"), is("val1")); + + assertThat(resultData.containsKey("key2.key3.key.4"), is(true)); + assertThat(resultData.get("key2.key3.key.4"), is("val2")); + + assertThat(resultData.containsKey("list1[].list2[].name"), is(false)); + assertThat(resultData.containsKey("list1.list2.name"), is(true)); + assertThat(resultData.get("list1.list2.name"), is(List.of("name1", 
"name2"))); + + assertThat(resultData.containsKey("list1[].list2[].value"), is(false)); + assertThat(resultData.containsKey("list1.list2.value"), is(true)); + assertThat(resultData.get("list1.list2.value"), is(List.of("value1", "value2"))); + } + @Test public void testEventNotProcessedWhenTheWhenConditionIsFalse() { final String whenCondition = UUID.randomUUID().toString(); diff --git a/data-prepper-plugins/geoip-processor/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/geoip-processor/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index 23c33feb6d..0000000000 --- a/data-prepper-plugins/geoip-processor/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1,3 +0,0 @@ -# To enable mocking of final classes with vanilla Mockito -# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods -mock-maker-inline diff --git a/data-prepper-plugins/grok-processor/build.gradle b/data-prepper-plugins/grok-processor/build.gradle index 82a8306a5d..ae4a82a0ee 100644 --- a/data-prepper-plugins/grok-processor/build.gradle +++ b/data-prepper-plugins/grok-processor/build.gradle @@ -12,7 +12,6 @@ dependencies { implementation 'com.fasterxml.jackson.core:jackson-databind' implementation "io.krakens:java-grok:0.1.9" implementation 'io.micrometer:micrometer-core' - testImplementation testLibs.mockito.inline testImplementation project(':data-prepper-test-common') } diff --git a/data-prepper-plugins/grok-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessor.java b/data-prepper-plugins/grok-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessor.java index 8b8b7f2e90..8cc9c6a716 100644 --- a/data-prepper-plugins/grok-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessor.java +++ b/data-prepper-plugins/grok-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessor.java @@ -12,10 +12,10 @@ import io.micrometer.core.instrument.Counter; import io.micrometer.core.instrument.Timer; import org.opensearch.dataprepper.expression.ExpressionEvaluator; +import org.opensearch.dataprepper.metrics.PluginMetrics; import org.opensearch.dataprepper.model.annotations.DataPrepperPlugin; import org.opensearch.dataprepper.model.annotations.DataPrepperPluginConstructor; import org.opensearch.dataprepper.model.annotations.SingleThread; -import org.opensearch.dataprepper.model.configuration.PluginSetting; import org.opensearch.dataprepper.model.event.Event; import org.opensearch.dataprepper.model.processor.AbstractProcessor; import org.opensearch.dataprepper.model.processor.Processor; @@ -59,7 +59,7 @@ @SingleThread -@DataPrepperPlugin(name = "grok", pluginType = Processor.class) +@DataPrepperPlugin(name = "grok", pluginType = Processor.class, pluginConfigurationType = GrokProcessorConfig.class) public class GrokProcessor extends AbstractProcessor<Record<Event>, Record<Event>> { static final long EXECUTOR_SERVICE_SHUTDOWN_TIMEOUT = 300L; @@ -89,20 +89,28 @@ public class GrokProcessor extends AbstractProcessor<Record<Event>, Record<Event>> this.keysToOverwrite = new HashSet<>(grokProcessorConfig.getkeysToOverwrite()); this.grokCompiler = grokCompiler; this.fieldToGrok = new LinkedHashMap<>(); this.executorService = executorService; this.expressionEvaluator = expressionEvaluator; this.tagsOnMatchFailure = grokProcessorConfig.getTagsOnMatchFailure(); - this.tagsOnTimeout = 
grokProcessorConfig.getTagsOnTimeout(); + this.tagsOnTimeout = grokProcessorConfig.getTagsOnTimeout().isEmpty() ? + grokProcessorConfig.getTagsOnMatchFailure() : grokProcessorConfig.getTagsOnTimeout(); grokProcessingMatchCounter = pluginMetrics.counter(GROK_PROCESSING_MATCH); grokProcessingMismatchCounter = pluginMetrics.counter(GROK_PROCESSING_MISMATCH); grokProcessingErrorsCounter = pluginMetrics.counter(GROK_PROCESSING_ERRORS); diff --git a/data-prepper-plugins/grok-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorConfig.java b/data-prepper-plugins/grok-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorConfig.java index de9daf91d5..2d2ae1ef41 100644 --- a/data-prepper-plugins/grok-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorConfig.java +++ b/data-prepper-plugins/grok-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorConfig.java @@ -5,8 +5,10 @@ package org.opensearch.dataprepper.plugins.processor.grok; -import org.opensearch.dataprepper.model.configuration.PluginSetting; +import com.fasterxml.jackson.annotation.JsonProperty; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; +import java.util.Collections; import java.util.List; import java.util.Map; @@ -39,69 +41,57 @@ public class GrokProcessorConfig { static final int DEFAULT_TIMEOUT_MILLIS = 30000; static final String DEFAULT_TARGET_KEY = null; - private final boolean breakOnMatch; - private final boolean keepEmptyCaptures; - private final Map<String, List<String>> match; - private final boolean namedCapturesOnly; - private final List<String> keysToOverwrite; - private final List<String> patternsDirectories; - private final String patternsFilesGlob; - private final Map<String, String> patternDefinitions; - private final int timeoutMillis; - private final String targetKey; - private final String grokWhen; - private final List<String> tagsOnMatchFailure; - private final List<String> tagsOnTimeout; - - private final boolean includePerformanceMetadata; - - private GrokProcessorConfig(final boolean breakOnMatch, - final boolean keepEmptyCaptures, - final Map<String, List<String>> match, - final boolean namedCapturesOnly, - final List<String> keysToOverwrite, - final List<String> patternsDirectories, - final String patternsFilesGlob, - final Map<String, String> patternDefinitions, - final int timeoutMillis, - final String targetKey, - final String grokWhen, - final List<String> tagsOnMatchFailure, - final List<String> tagsOnTimeout, - final boolean includePerformanceMetadata) { - - this.breakOnMatch = breakOnMatch; - this.keepEmptyCaptures = keepEmptyCaptures; - this.match = match; - this.namedCapturesOnly = namedCapturesOnly; - this.keysToOverwrite = keysToOverwrite; - this.patternsDirectories = patternsDirectories; - this.patternsFilesGlob = patternsFilesGlob; - this.patternDefinitions = patternDefinitions; - this.timeoutMillis = timeoutMillis; - this.targetKey = targetKey; - this.grokWhen = grokWhen; - this.tagsOnMatchFailure = tagsOnMatchFailure; - this.tagsOnTimeout = tagsOnTimeout.isEmpty() ? 
tagsOnMatchFailure : tagsOnTimeout; - this.includePerformanceMetadata = includePerformanceMetadata; - } - - public static GrokProcessorConfig buildConfig(final PluginSetting pluginSetting) { - return new GrokProcessorConfig(pluginSetting.getBooleanOrDefault(BREAK_ON_MATCH, DEFAULT_BREAK_ON_MATCH), - pluginSetting.getBooleanOrDefault(KEEP_EMPTY_CAPTURES, DEFAULT_KEEP_EMPTY_CAPTURES), - pluginSetting.getTypedListMap(MATCH, String.class, String.class), - pluginSetting.getBooleanOrDefault(NAMED_CAPTURES_ONLY, DEFAULT_NAMED_CAPTURES_ONLY), - pluginSetting.getTypedList(KEYS_TO_OVERWRITE, String.class), - pluginSetting.getTypedList(PATTERNS_DIRECTORIES, String.class), - pluginSetting.getStringOrDefault(PATTERNS_FILES_GLOB, DEFAULT_PATTERNS_FILES_GLOB), - pluginSetting.getTypedMap(PATTERN_DEFINITIONS, String.class, String.class), - pluginSetting.getIntegerOrDefault(TIMEOUT_MILLIS, DEFAULT_TIMEOUT_MILLIS), - pluginSetting.getStringOrDefault(TARGET_KEY, DEFAULT_TARGET_KEY), - pluginSetting.getStringOrDefault(GROK_WHEN, null), - pluginSetting.getTypedList(TAGS_ON_MATCH_FAILURE, String.class), - pluginSetting.getTypedList(TAGS_ON_TIMEOUT, String.class), - pluginSetting.getBooleanOrDefault(INCLUDE_PERFORMANCE_METADATA, false)); - } + @JsonProperty(BREAK_ON_MATCH) + @JsonPropertyDescription("Specifies whether to match all patterns (`false`) or stop once the first successful " + + "match is found (`true`). Default is `true`.") + private boolean breakOnMatch = DEFAULT_BREAK_ON_MATCH; + @JsonProperty(KEEP_EMPTY_CAPTURES) + @JsonPropertyDescription("Enables the preservation of `null` captures from the processed output. Default is `false`.") + private boolean keepEmptyCaptures = DEFAULT_KEEP_EMPTY_CAPTURES; + @JsonProperty(MATCH) + @JsonPropertyDescription("Specifies which keys should match specific patterns. Default is an empty map.") + private Map<String, List<String>> match = Collections.emptyMap(); + @JsonProperty(NAMED_CAPTURES_ONLY) + @JsonPropertyDescription("Specifies whether to keep only named captures. Default is `true`.") + private boolean namedCapturesOnly = DEFAULT_NAMED_CAPTURES_ONLY; + @JsonProperty(KEYS_TO_OVERWRITE) + @JsonPropertyDescription("Specifies which existing keys will be overwritten if there is a capture with the same key value. " + + "Default is `[]`.") + private List<String> keysToOverwrite = Collections.emptyList(); + @JsonProperty(PATTERNS_DIRECTORIES) + @JsonPropertyDescription("Specifies which directory paths contain the custom pattern files. Default is an empty list.") + private List<String> patternsDirectories = Collections.emptyList(); + @JsonProperty(PATTERNS_FILES_GLOB) + @JsonPropertyDescription("Specifies which pattern files to use from the directories specified for " + + "`patterns_directories`. Default is `*`.") + private String patternsFilesGlob = DEFAULT_PATTERNS_FILES_GLOB; + @JsonProperty(PATTERN_DEFINITIONS) + @JsonPropertyDescription("Allows for custom patterns that can be defined inline in the processor configuration. " + + "Default is an empty map.") + private Map<String, String> patternDefinitions = Collections.emptyMap(); + @JsonProperty(TIMEOUT_MILLIS) + @JsonPropertyDescription("The maximum amount of time during which matching occurs. " + + "Setting to `0` disables the timeout so that matching is not time-bound. Default is `30000`.") + private int timeoutMillis = DEFAULT_TIMEOUT_MILLIS; + @JsonProperty(TARGET_KEY) + @JsonPropertyDescription("Specifies a parent-level key used to store all captures. 
Default value is `null`.") + private String targetKey = DEFAULT_TARGET_KEY; + @JsonProperty(GROK_WHEN) + @JsonPropertyDescription("Specifies under what condition the `grok` processor should perform matching. " + + "Default is no condition.") + private String grokWhen; + @JsonProperty(TAGS_ON_MATCH_FAILURE) + @JsonPropertyDescription("A `List` of `String`s that specifies the tags to be set in the event when grok fails to " + + "match or an unknown exception occurs while matching. These tags may be used in conditional expressions in " + + "other parts of the configuration.") + private List<String> tagsOnMatchFailure = Collections.emptyList(); + @JsonProperty(TAGS_ON_TIMEOUT) + @JsonPropertyDescription("A `List` of `String`s that specifies the tags to be set in the event when grok match times out.") + private List<String> tagsOnTimeout = Collections.emptyList(); + @JsonProperty(INCLUDE_PERFORMANCE_METADATA) + @JsonPropertyDescription("A `Boolean` indicating whether to include performance metadata in the event metadata, " + + "for example, `_total_grok_patterns_attempted` and `_total_grok_processing_time`.") + private boolean includePerformanceMetadata = false; public boolean isBreakOnMatch() { return breakOnMatch; diff --git a/data-prepper-plugins/grok-processor/src/main/resources/grok-patterns/patterns b/data-prepper-plugins/grok-processor/src/main/resources/grok-patterns/patterns index bb433620d7..b5d14ae632 100644 --- a/data-prepper-plugins/grok-processor/src/main/resources/grok-patterns/patterns +++ b/data-prepper-plugins/grok-processor/src/main/resources/grok-patterns/patterns @@ -14,6 +14,6 @@ ELB_ACCESS_LOG %{TIMESTAMP_ISO8601:timestamp}\s%{NOTSPACE:elb}\s%{IP:clientip}:% S3_HTTP_REQUEST ((?:%{WORD:verb}\s%{NOTSPACE:request}\s(?:HTTP/%{NUMBER:httpversion}))?|%{DATA:rawrequest}) S3_ACCESS_LOG %{WORD:owner}\s%{NOTSPACE:bucket}\s\[%{HTTPDATE:timestamp}\]\s%{IP:clientip}\s%{NOTSPACE:requester}\s%{NOTSPACE:request_id}\s%{NOTSPACE:operation}\s%{NOTSPACE:key}\s(?:-|"%{S3_HTTP_REQUEST}")\s(?:-|%{INT:response:int})\s(?:-|%{NOTSPACE:error_code})\s(?:-|%{INT:bytes_sent:int})\s(?:-|%{INT:object_size:int})\s(?:-|%{INT:request_time_ms:int})\s(?:-|%{INT:turnaround_time_ms:int})\s(?:%{QS:referrer})\s(?:-|"?%{QS:agent}"?)\s(?:-|%{NOTSPACE:version_id}) -CLOUDFRONT_ACCESS_LOG (?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}\s%{TIME})\s%{NOTSPACE:x_edge_location}\s(?:-|%{NUMBER:sc_bytes:int})\s%{IPORHOST:clientip}\s%{WORD:cs_method}\s%{HOSTNAME:cs_host}\s%{NOTSPACE:cs_uri_stem}\s(?:-|%{NUMBER:sc_status:int})\s%{GREEDYDATA:referrer}\s%{GREEDYDATA:agent}\s%{GREEDYDATA:cs_uri_query}\s%{GREEDYDATA:cookies}\s%{WORD:x_edge_result_type}\s%{NOTSPACE:x_edge_request_id}\s%{HOSTNAME:x_host_header}\s%{URIPROTO:cs_protocol}\s(?:-|%{INT:cs_bytes:int})\s(?:-|%{GREEDYDATA:time_taken:float}\s%{GREEDYDATA:x_forwarded_for}\s%{GREEDYDATA:ssl_protocol}\s%{GREEDYDATA:ssl_cipher}\s%{GREEDYDATA:x_edge_response_result_type} +CLOUDFRONT_ACCESS_LOG (?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}\s%{TIME})\s%{NOTSPACE:x_edge_location}\s(?:-|%{NUMBER:sc_bytes:int})\s%{IPORHOST:clientip}\s%{WORD:cs_method}\s%{HOSTNAME:cs_host}\s%{NOTSPACE:cs_uri_stem}\s(?:-|%{NUMBER:sc_status:int})\s%{GREEDYDATA:referrer}\s%{GREEDYDATA:agent}\s%{GREEDYDATA:cs_uri_query}\s%{GREEDYDATA:cookies}\s%{WORD:x_edge_result_type}\s%{NOTSPACE:x_edge_request_id}\s%{HOSTNAME:x_host_header}\s%{URIPROTO:cs_protocol}\s(?:-|%{INT:cs_bytes:int})\s(?:-|%{GREEDYDATA:time_taken:float})\s%{GREEDYDATA:x_forwarded_for}\s%{GREEDYDATA:ssl_protocol}\s%{GREEDYDATA:ssl_cipher}\s%{GREEDYDATA:x_edge_response_result_type} -VPC_FLOW_LOG 
%{NUMBER:version}\s%{NUMBER:account-id}\s%{NOTSPACE:interface-id}\s%{NOTSPACE:srcaddr}\s%{NOTSPACE:dstaddr}\s(?:-|%{NOTSPACE:srcport:int})\s(?:-|%{NOTSPACE:dstport:int})\s(?:-|%{NOTSPACE:protocol:int})\s(?:-|%{NOTSPACE:packets:int})\s(?:-|%{NOTSPACE:bytes:int})\s(?:-|%{NUMBER:start:int})\s(?:-|%{NUMBER:end:int})\s%{NOTSPACE:action}\s%{NOTSPACE:log-status} \ No newline at end of file +VPC_FLOW_LOG %{NUMBER:version}\s%{NUMBER:account-id}\s%{NOTSPACE:interface-id}\s%{NOTSPACE:srcaddr}\s%{NOTSPACE:dstaddr}\s(?:-|%{NOTSPACE:srcport:int})\s(?:-|%{NOTSPACE:dstport:int})\s(?:-|%{NOTSPACE:protocol:int})\s(?:-|%{NOTSPACE:packets:int})\s(?:-|%{NOTSPACE:bytes:int})\s(?:-|%{NUMBER:start:int})\s(?:-|%{NUMBER:end:int})\s%{NOTSPACE:action}\s%{NOTSPACE:log-status} diff --git a/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorConfigTests.java b/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorConfigTests.java index eb69968a96..37c5ec9cb1 100644 --- a/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorConfigTests.java +++ b/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorConfigTests.java @@ -5,6 +5,7 @@ package org.opensearch.dataprepper.plugins.processor.grok; +import com.fasterxml.jackson.databind.ObjectMapper; import org.opensearch.dataprepper.model.configuration.PluginSetting; import org.junit.jupiter.api.BeforeAll; import org.junit.jupiter.api.Test; @@ -27,6 +28,7 @@ import static org.opensearch.dataprepper.plugins.processor.grok.GrokProcessorConfig.DEFAULT_TIMEOUT_MILLIS; public class GrokProcessorConfigTests { + private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper(); private static final String PLUGIN_NAME = "grok"; private static final Map<String, List<String>> TEST_MATCH = new HashMap<>(); @@ -62,7 +64,8 @@ public static void setUp() { @Test public void testDefault() { - final GrokProcessorConfig grokProcessorConfig = GrokProcessorConfig.buildConfig(new PluginSetting(PLUGIN_NAME, null)); + final GrokProcessorConfig grokProcessorConfig = OBJECT_MAPPER.convertValue( + Collections.emptyMap(), GrokProcessorConfig.class); assertThat(grokProcessorConfig.isBreakOnMatch(), equalTo(DEFAULT_BREAK_ON_MATCH)); assertThat(grokProcessorConfig.isKeepEmptyCaptures(), equalTo(DEFAULT_KEEP_EMPTY_CAPTURES)); @@ -95,7 +98,8 @@ public void testValidConfig() { TEST_TARGET_KEY, true); - final GrokProcessorConfig grokProcessorConfig = GrokProcessorConfig.buildConfig(validPluginSetting); + final GrokProcessorConfig grokProcessorConfig = OBJECT_MAPPER.convertValue( + validPluginSetting.getSettings(), GrokProcessorConfig.class); assertThat(grokProcessorConfig.isBreakOnMatch(), equalTo(false)); assertThat(grokProcessorConfig.isKeepEmptyCaptures(), equalTo(true)); @@ -127,7 +131,8 @@ public void testInvalidConfig() { invalidPluginSetting.getSettings().put(GrokProcessorConfig.MATCH, TEST_INVALID_MATCH); - assertThrows(IllegalArgumentException.class, () -> GrokProcessorConfig.buildConfig(invalidPluginSetting)); + assertThrows(IllegalArgumentException.class, () -> OBJECT_MAPPER.convertValue( + invalidPluginSetting.getSettings(), GrokProcessorConfig.class)); } private PluginSetting completePluginSettingForGrokProcessor(final boolean breakOnMatch, @@ -160,33 +165,22 @@ private PluginSetting completePluginSettingForGrokProcessor(final boolean breakO @Test void 
getTagsOnMatchFailure_returns_tagOnMatch() { final List<String> tagsOnMatch = List.of(UUID.randomUUID().toString(), UUID.randomUUID().toString()); - final GrokProcessorConfig objectUnderTest = GrokProcessorConfig.buildConfig(new PluginSetting(PLUGIN_NAME, - Map.of(GrokProcessorConfig.TAGS_ON_MATCH_FAILURE, tagsOnMatch) - )); + final GrokProcessorConfig objectUnderTest = OBJECT_MAPPER.convertValue( + Map.of(GrokProcessorConfig.TAGS_ON_MATCH_FAILURE, tagsOnMatch), GrokProcessorConfig.class); assertThat(objectUnderTest.getTagsOnMatchFailure(), equalTo(tagsOnMatch)); } - @Test - void getTagsOnTimeout_returns_tagsOnMatch_if_no_tagsOnTimeout() { - final List<String> tagsOnMatch = List.of(UUID.randomUUID().toString(), UUID.randomUUID().toString()); - final GrokProcessorConfig objectUnderTest = GrokProcessorConfig.buildConfig(new PluginSetting(PLUGIN_NAME, - Map.of(GrokProcessorConfig.TAGS_ON_MATCH_FAILURE, tagsOnMatch) - )); - - assertThat(objectUnderTest.getTagsOnTimeout(), equalTo(tagsOnMatch)); - } - @Test void getTagsOnTimeout_returns_tagsOnTimeout_if_present() { final List<String> tagsOnMatch = List.of(UUID.randomUUID().toString(), UUID.randomUUID().toString()); final List<String> tagsOnTimeout = List.of(UUID.randomUUID().toString(), UUID.randomUUID().toString()); - final GrokProcessorConfig objectUnderTest = GrokProcessorConfig.buildConfig(new PluginSetting(PLUGIN_NAME, + final GrokProcessorConfig objectUnderTest = OBJECT_MAPPER.convertValue( Map.of( GrokProcessorConfig.TAGS_ON_MATCH_FAILURE, tagsOnMatch, GrokProcessorConfig.TAGS_ON_TIMEOUT, tagsOnTimeout - ) - )); + ), + GrokProcessorConfig.class); assertThat(objectUnderTest.getTagsOnTimeout(), equalTo(tagsOnTimeout)); } @@ -194,9 +188,8 @@ void getTagsOnTimeout_returns_tagsOnTimeout_if_present() { @Test void getTagsOnTimeout_returns_tagsOnTimeout_if_present_and_no_tagsOnMatch() { final List<String> tagsOnTimeout = List.of(UUID.randomUUID().toString(), UUID.randomUUID().toString()); - final GrokProcessorConfig objectUnderTest = GrokProcessorConfig.buildConfig(new PluginSetting(PLUGIN_NAME, - Map.of(GrokProcessorConfig.TAGS_ON_TIMEOUT, tagsOnTimeout) - )); + final GrokProcessorConfig objectUnderTest = OBJECT_MAPPER.convertValue( + Map.of(GrokProcessorConfig.TAGS_ON_TIMEOUT, tagsOnTimeout), GrokProcessorConfig.class); assertThat(objectUnderTest.getTagsOnTimeout(), equalTo(tagsOnTimeout)); } diff --git a/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorIT.java b/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorIT.java index 1c8d0036c2..f6fa090405 100644 --- a/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorIT.java +++ b/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorIT.java @@ -16,6 +16,7 @@ import org.junit.jupiter.params.provider.MethodSource; import org.mockito.Mock; import org.opensearch.dataprepper.expression.ExpressionEvaluator; +import org.opensearch.dataprepper.metrics.PluginMetrics; import org.opensearch.dataprepper.model.configuration.PluginSetting; import org.opensearch.dataprepper.model.event.Event; import org.opensearch.dataprepper.model.record.Record; @@ -38,6 +39,8 @@ public class GrokProcessorIT { private PluginSetting pluginSetting; + private PluginMetrics pluginMetrics; + private GrokProcessorConfig grokProcessorConfig; private GrokProcessor grokProcessor; private static final ObjectMapper OBJECT_MAPPER = new 
ObjectMapper(); private static final TypeReference<Map<String, Object>> MAP_TYPE_REFERENCE = new TypeReference<Map<String, Object>>() {}; @@ -65,6 +68,8 @@ public void setup() { null); pluginSetting.setPipelineName("grokPipeline"); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); + pluginMetrics = PluginMetrics.fromPluginSetting(pluginSetting); // This is a COMMONAPACHELOG pattern with the following format // COMMONAPACHELOG %{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})" %{NUMBER:response} (?:%{NUMBER:bytes}|-) @@ -115,7 +120,8 @@ public void testMatchNoCapturesWithExistingAndNonExistingKey() throws JsonProces matchConfig.put("bad_key", Collections.singletonList(nonMatchingPattern)); pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); - grokProcessor = new GrokProcessor(pluginSetting, expressionEvaluator); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); + grokProcessor = new GrokProcessor(pluginMetrics, grokProcessorConfig, expressionEvaluator); final Map<String, Object> testData = new HashMap<>(); testData.put("message", messageInput); @@ -135,7 +141,8 @@ public void testSingleMatchSinglePatternWithDefaults() throws JsonProcessingExce matchConfig.put("message", Collections.singletonList("%{COMMONAPACHELOG}")); pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); - grokProcessor = new GrokProcessor(pluginSetting, expressionEvaluator); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); + grokProcessor = new GrokProcessor(pluginMetrics, grokProcessorConfig, expressionEvaluator); final Map<String, Object> testData = new HashMap<>(); testData.put("message", messageInput); @@ -173,7 +180,8 @@ public void testSingleMatchMultiplePatternWithBreakOnMatchFalse() throws JsonPro pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); pluginSetting.getSettings().put(GrokProcessorConfig.BREAK_ON_MATCH, false); - grokProcessor = new GrokProcessor(pluginSetting, expressionEvaluator); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); + grokProcessor = new GrokProcessor(pluginMetrics, grokProcessorConfig, expressionEvaluator); final Map<String, Object> testData = new HashMap<>(); testData.put("message", messageInput); @@ -208,7 +216,8 @@ public void testSingleMatchTypeConversionWithDefaults() throws JsonProcessingExc matchConfig.put("message", Collections.singletonList("\"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{NUMBER:response:int} (?:%{NUMBER:bytes:float}|-)")); pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); - grokProcessor = new GrokProcessor(pluginSetting, expressionEvaluator); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); + grokProcessor = new GrokProcessor(pluginMetrics, grokProcessorConfig, expressionEvaluator); final Map<String, Object> testData = new HashMap<>(); testData.put("message", messageInput); @@ -240,7 +249,7 @@ public void testMultipleMatchWithBreakOnMatchFalse() throws JsonProcessingExcept pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); pluginSetting.getSettings().put(GrokProcessorConfig.BREAK_ON_MATCH, false); - grokProcessor = new GrokProcessor(pluginSetting, expressionEvaluator); + grokProcessorConfig = 
OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); + grokProcessor = new GrokProcessor(pluginMetrics, grokProcessorConfig, expressionEvaluator); final Map<String, Object> testData = new HashMap<>(); testData.put("message", messageInput); @@ -278,7 +288,8 @@ public void testMatchWithKeepEmptyCapturesTrue() throws JsonProcessingException pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); pluginSetting.getSettings().put(GrokProcessorConfig.KEEP_EMPTY_CAPTURES, true); - grokProcessor = new GrokProcessor(pluginSetting, expressionEvaluator); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); + grokProcessor = new GrokProcessor(pluginMetrics, grokProcessorConfig, expressionEvaluator); final Map<String, Object> testData = new HashMap<>(); testData.put("message", messageInput); @@ -314,7 +325,8 @@ public void testMatchWithNamedCapturesOnlyFalse() throws JsonProcessingException pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); pluginSetting.getSettings().put(GrokProcessorConfig.NAMED_CAPTURES_ONLY, false); - grokProcessor = new GrokProcessor(pluginSetting, expressionEvaluator); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); + grokProcessor = new GrokProcessor(pluginMetrics, grokProcessorConfig, expressionEvaluator); final Map<String, Object> testData = new HashMap<>(); testData.put("message", "This is my greedy data before matching 192.0.2.1 123456"); @@ -346,7 +358,8 @@ public void testPatternDefinitions() throws JsonProcessingException { pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); pluginSetting.getSettings().put(GrokProcessorConfig.PATTERN_DEFINITIONS, patternDefinitions); - grokProcessor = new GrokProcessor(pluginSetting, expressionEvaluator); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); + grokProcessor = new GrokProcessor(pluginMetrics, grokProcessorConfig, expressionEvaluator); final Map<String, Object> testData = new HashMap<>(); testData.put("message", "This is my greedy data before matching with my phone number 123-456-789"); @@ -389,7 +402,8 @@ public void testPatternsDirWithDefaultPatternsFilesGlob() throws JsonProcessingE pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); pluginSetting.getSettings().put(GrokProcessorConfig.PATTERNS_DIRECTORIES, patternsDirectories); - grokProcessor = new GrokProcessor(pluginSetting, expressionEvaluator); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); + grokProcessor = new GrokProcessor(pluginMetrics, grokProcessorConfig, expressionEvaluator); final Record<Event> resultRecord = buildRecordWithEvent(resultData); @@ -422,7 +436,8 @@ public void testPatternsDirWithCustomPatternsFilesGlob() throws JsonProcessingEx pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); pluginSetting.getSettings().put(GrokProcessorConfig.PATTERNS_DIRECTORIES, patternsDirectories); pluginSetting.getSettings().put(GrokProcessorConfig.PATTERNS_FILES_GLOB, "*1.txt"); - grokProcessor = new GrokProcessor(pluginSetting, expressionEvaluator); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); + grokProcessor = new GrokProcessor(pluginMetrics, grokProcessorConfig, expressionEvaluator); final Record<Event> resultRecord = buildRecordWithEvent(resultData); @@ -436,8 +451,10 @@ public void testPatternsDirWithCustomPatternsFilesGlob() 
throws JsonProcessingEx matchConfigWithPatterns2Pattern.put("message", Collections.singletonList("My birthday is %{CUSTOMBIRTHDAYPATTERN:my_birthday}")); pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfigWithPatterns2Pattern); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); - Throwable throwable = assertThrows(IllegalArgumentException.class, () -> new GrokProcessor(pluginSetting, expressionEvaluator)); + Throwable throwable = assertThrows(IllegalArgumentException.class, () -> new GrokProcessor( + pluginMetrics, grokProcessorConfig, expressionEvaluator)); assertThat("No definition for key 'CUSTOMBIRTHDAYPATTERN' found, aborting", equalTo(throwable.getMessage())); } @@ -447,7 +464,8 @@ public void testMatchWithNamedCapturesSyntax() throws JsonProcessingException { matchConfig.put("message", Collections.singletonList("%{GREEDYDATA:greedy_data} (?\\d\\d\\d-\\d\\d\\d-\\d\\d\\d)")); pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); - grokProcessor = new GrokProcessor(pluginSetting, expressionEvaluator); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); + grokProcessor = new GrokProcessor(pluginMetrics, grokProcessorConfig, expressionEvaluator); final Map<String, Object> testData = new HashMap<>(); testData.put("message", "This is my greedy data before matching with my phone number 123-456-789"); @@ -477,7 +495,8 @@ public void testMatchWithNoCapturesAndTags() throws JsonProcessingException { pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); pluginSetting.getSettings().put(GrokProcessorConfig.TAGS_ON_MATCH_FAILURE, List.of(tagOnMatchFailure1, tagOnMatchFailure2)); - grokProcessor = new GrokProcessor(pluginSetting, expressionEvaluator); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); + grokProcessor = new GrokProcessor(pluginMetrics, grokProcessorConfig, expressionEvaluator); final Map<String, Object> testData = new HashMap<>(); testData.put("log", "This is my greedy data before matching with my phone number 123-456-789"); @@ -495,14 +514,16 @@ public void testMatchWithNoCapturesAndTags() throws JsonProcessingException { @Test public void testCompileNonRegisteredPatternThrowsIllegalArgumentException() { - grokProcessor = new GrokProcessor(pluginSetting, expressionEvaluator); + grokProcessor = new GrokProcessor(pluginMetrics, grokProcessorConfig, expressionEvaluator); final Map<String, List<String>> matchConfig = new HashMap<>(); matchConfig.put("message", Collections.singletonList("%{NONEXISTENTPATTERN}")); pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); - assertThrows(IllegalArgumentException.class, () -> new GrokProcessor(pluginSetting, expressionEvaluator)); + assertThrows(IllegalArgumentException.class, () -> new GrokProcessor( + pluginMetrics, grokProcessorConfig, expressionEvaluator)); } @ParameterizedTest @@ -512,7 +533,8 @@ void testDataPrepperBuiltInGrokPatterns(final String matchPattern, final String matchConfig.put("message", Collections.singletonList(matchPattern)); pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); - grokProcessor = new GrokProcessor(pluginSetting, expressionEvaluator); + grokProcessorConfig = OBJECT_MAPPER.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class); + grokProcessor = new GrokProcessor(pluginMetrics, 
grokProcessorConfig, expressionEvaluator); final Map testData = new HashMap(); testData.put("message", logInput); diff --git a/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorTests.java b/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorTests.java index e9d17121d8..aedad1fe5c 100644 --- a/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorTests.java +++ b/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorTests.java @@ -20,11 +20,9 @@ import org.junit.jupiter.params.ParameterizedTest; import org.junit.jupiter.params.provider.ValueSource; import org.mockito.Mock; -import org.mockito.MockedStatic; import org.mockito.junit.jupiter.MockitoExtension; import org.opensearch.dataprepper.expression.ExpressionEvaluator; import org.opensearch.dataprepper.metrics.PluginMetrics; -import org.opensearch.dataprepper.model.configuration.PluginSetting; import org.opensearch.dataprepper.model.event.Event; import org.opensearch.dataprepper.model.event.JacksonEvent; import org.opensearch.dataprepper.model.record.Record; @@ -52,7 +50,6 @@ import static org.mockito.ArgumentMatchers.eq; import static org.mockito.Mockito.any; import static org.mockito.Mockito.lenient; -import static org.mockito.Mockito.mockStatic; import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; import static org.mockito.Mockito.verifyNoInteractions; @@ -109,23 +106,22 @@ public class GrokProcessorTests { @Mock private ExpressionEvaluator expressionEvaluator; - - private PluginSetting pluginSetting; + @Mock + private GrokProcessorConfig grokProcessorConfig; private final String PLUGIN_NAME = "grok"; private Map capture; private final Map> matchConfig = new HashMap<>(); @BeforeEach public void setup() throws TimeoutException, ExecutionException, InterruptedException { - pluginSetting = getDefaultPluginSetting(); - pluginSetting.setPipelineName("grokPipeline"); + configureDefaultGrokProcessorConfig(); final List matchPatterns = new ArrayList<>(); matchPatterns.add("%{PATTERN1}"); matchPatterns.add("%{PATTERN2}"); matchConfig.put("message", matchPatterns); - pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig); + when(grokProcessorConfig.getMatch()).thenReturn(matchConfig); lenient().when(pluginMetrics.counter(GrokProcessor.GROK_PROCESSING_MATCH)).thenReturn(grokProcessingMatchCounter); lenient().when(pluginMetrics.counter(GrokProcessor.GROK_PROCESSING_MISMATCH)).thenReturn(grokProcessingMismatchCounter); @@ -155,15 +151,13 @@ public void setup() throws TimeoutException, ExecutionException, InterruptedExce } private GrokProcessor createObjectUnderTest() { - try (MockedStatic pluginMetricsMockedStatic = mockStatic(PluginMetrics.class)) { - pluginMetricsMockedStatic.when(() -> PluginMetrics.fromPluginSetting(pluginSetting)).thenReturn(pluginMetrics); - return new GrokProcessor(pluginSetting, grokCompiler, executorService, expressionEvaluator); - } + return new GrokProcessor( + pluginMetrics, grokProcessorConfig, grokCompiler, executorService, expressionEvaluator); } @Test public void testMatchMerge() throws JsonProcessingException, ExecutionException, InterruptedException, TimeoutException { - pluginSetting.getSettings().put(GrokProcessorConfig.INCLUDE_PERFORMANCE_METADATA, false); + 
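[Editor's note, not part of the patch] The GrokProcessorIT changes above all follow one pattern: the old `new GrokProcessor(pluginSetting, expressionEvaluator)` constructor is replaced by one taking `PluginMetrics` and a typed `GrokProcessorConfig`, with the config built from the raw settings map via Jackson. A minimal sketch of the idiom, assuming a plain `ObjectMapper` like the tests' `OBJECT_MAPPER`:

    import com.fasterxml.jackson.databind.ObjectMapper;

    // Bind the raw settings map to the typed config class, mirroring how the
    // plugin framework deserializes pipeline configuration.
    final ObjectMapper objectMapper = new ObjectMapper();
    final GrokProcessorConfig config =
            objectMapper.convertValue(pluginSetting.getSettings(), GrokProcessorConfig.class);
    final GrokProcessor processor = new GrokProcessor(pluginMetrics, config, expressionEvaluator);

`convertValue` performs the data binding in memory, so no round trip through JSON text is needed to reuse the map-based test fixtures.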
diff --git a/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorTests.java b/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorTests.java
index e9d17121d8..aedad1fe5c 100644
--- a/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorTests.java
+++ b/data-prepper-plugins/grok-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessorTests.java
@@ -20,11 +20,9 @@
 import org.junit.jupiter.params.ParameterizedTest;
 import org.junit.jupiter.params.provider.ValueSource;
 import org.mockito.Mock;
-import org.mockito.MockedStatic;
 import org.mockito.junit.jupiter.MockitoExtension;
 import org.opensearch.dataprepper.expression.ExpressionEvaluator;
 import org.opensearch.dataprepper.metrics.PluginMetrics;
-import org.opensearch.dataprepper.model.configuration.PluginSetting;
 import org.opensearch.dataprepper.model.event.Event;
 import org.opensearch.dataprepper.model.event.JacksonEvent;
 import org.opensearch.dataprepper.model.record.Record;
@@ -52,7 +50,6 @@
 import static org.mockito.ArgumentMatchers.eq;
 import static org.mockito.Mockito.any;
 import static org.mockito.Mockito.lenient;
-import static org.mockito.Mockito.mockStatic;
 import static org.mockito.Mockito.times;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.verifyNoInteractions;
@@ -109,23 +106,22 @@ public class GrokProcessorTests {
     @Mock
     private ExpressionEvaluator expressionEvaluator;
-
-    private PluginSetting pluginSetting;
+    @Mock
+    private GrokProcessorConfig grokProcessorConfig;
     private final String PLUGIN_NAME = "grok";
     private Map<String, Object> capture;
     private final Map<String, List<String>> matchConfig = new HashMap<>();
 
     @BeforeEach
     public void setup() throws TimeoutException, ExecutionException, InterruptedException {
-        pluginSetting = getDefaultPluginSetting();
-        pluginSetting.setPipelineName("grokPipeline");
+        configureDefaultGrokProcessorConfig();
 
         final List<String> matchPatterns = new ArrayList<>();
         matchPatterns.add("%{PATTERN1}");
         matchPatterns.add("%{PATTERN2}");
         matchConfig.put("message", matchPatterns);
-        pluginSetting.getSettings().put(GrokProcessorConfig.MATCH, matchConfig);
+        when(grokProcessorConfig.getMatch()).thenReturn(matchConfig);
 
         lenient().when(pluginMetrics.counter(GrokProcessor.GROK_PROCESSING_MATCH)).thenReturn(grokProcessingMatchCounter);
         lenient().when(pluginMetrics.counter(GrokProcessor.GROK_PROCESSING_MISMATCH)).thenReturn(grokProcessingMismatchCounter);
@@ -155,15 +151,13 @@ public void setup() throws TimeoutException, ExecutionException, InterruptedExce
     }
 
     private GrokProcessor createObjectUnderTest() {
-        try (MockedStatic<PluginMetrics> pluginMetricsMockedStatic = mockStatic(PluginMetrics.class)) {
-            pluginMetricsMockedStatic.when(() -> PluginMetrics.fromPluginSetting(pluginSetting)).thenReturn(pluginMetrics);
-            return new GrokProcessor(pluginSetting, grokCompiler, executorService, expressionEvaluator);
-        }
+        return new GrokProcessor(
+                pluginMetrics, grokProcessorConfig, grokCompiler, executorService, expressionEvaluator);
     }
 
     @Test
     public void testMatchMerge() throws JsonProcessingException, ExecutionException, InterruptedException, TimeoutException {
-        pluginSetting.getSettings().put(GrokProcessorConfig.INCLUDE_PERFORMANCE_METADATA, false);
+        when(grokProcessorConfig.getIncludePerformanceMetadata()).thenReturn(false);
 
         grokProcessor = createObjectUnderTest();
@@ -202,7 +196,7 @@ public void testMatchMerge() throws JsonProcessingException, ExecutionException,
 
     @Test
     public void testTarget() throws JsonProcessingException, ExecutionException, InterruptedException, TimeoutException {
-        pluginSetting.getSettings().put(GrokProcessorConfig.TARGET_KEY, "test_target");
+        when(grokProcessorConfig.getTargetKey()).thenReturn("test_target");
 
         grokProcessor = createObjectUnderTest();
         capture.put("key_capture_1", "value_capture_1");
@@ -238,7 +232,7 @@ public void testTarget() throws JsonProcessingException, ExecutionException, Int
 
     @Test
     public void testOverwrite() throws JsonProcessingException {
-        pluginSetting.getSettings().put(GrokProcessorConfig.KEYS_TO_OVERWRITE, Collections.singletonList("message"));
+        when(grokProcessorConfig.getkeysToOverwrite()).thenReturn(Collections.singletonList("message"));
 
         grokProcessor = createObjectUnderTest();
         capture.put("key_capture_1", "value_capture_1");
@@ -423,7 +417,7 @@ public void testThatTimeoutExceptionIsCaughtAndProcessingContinues() throws Json
 
     @Test
     public void testThatProcessingWithTimeoutMillisOfZeroDoesNotInteractWithExecutorServiceAndReturnsCorrectResult() throws JsonProcessingException {
-        pluginSetting.getSettings().put(GrokProcessorConfig.TIMEOUT_MILLIS, 0);
+        when(grokProcessorConfig.getTimeoutMillis()).thenReturn(0);
 
         grokProcessor = createObjectUnderTest();
         capture.put("key_capture_1", "value_capture_1");
@@ -528,7 +522,7 @@ public void testNoCaptures() throws JsonProcessingException {
 
     @Test
     public void testMatchOnSecondPattern() throws JsonProcessingException {
-        pluginSetting.getSettings().put(GrokProcessorConfig.INCLUDE_PERFORMANCE_METADATA, true);
+        when(grokProcessorConfig.getIncludePerformanceMetadata()).thenReturn(true);
 
         when(match.capture()).thenReturn(Collections.emptyMap());
 
         when(grokSecondMatch.match(messageInput)).thenReturn(secondMatch);
@@ -556,7 +550,7 @@ public void testMatchOnSecondPattern() throws JsonProcessingException {
 
     @Test
     public void testMatchOnSecondPatternWithExistingMetadataForTotalPatternMatches() throws JsonProcessingException {
-        pluginSetting.getSettings().put(GrokProcessorConfig.INCLUDE_PERFORMANCE_METADATA, true);
+        when(grokProcessorConfig.getIncludePerformanceMetadata()).thenReturn(true);
 
         when(match.capture()).thenReturn(Collections.emptyMap());
 
         when(grokSecondMatch.match(messageInput)).thenReturn(secondMatch);
@@ -598,8 +592,10 @@ void setUp() {
         tagOnMatchFailure2 = UUID.randomUUID().toString();
         tagOnTimeout1 = UUID.randomUUID().toString();
         tagOnTimeout2 = UUID.randomUUID().toString();
-        pluginSetting.getSettings().put(GrokProcessorConfig.TAGS_ON_MATCH_FAILURE, List.of(tagOnMatchFailure1, tagOnMatchFailure2));
-        pluginSetting.getSettings().put(GrokProcessorConfig.TAGS_ON_TIMEOUT, List.of(tagOnTimeout1, tagOnTimeout2));
+        when(grokProcessorConfig.getTagsOnMatchFailure()).thenReturn(
+                List.of(tagOnMatchFailure1, tagOnMatchFailure2));
+        when(grokProcessorConfig.getTagsOnTimeout()).thenReturn(
+                List.of(tagOnTimeout1, tagOnTimeout2));
     }
 
     @Test
@@ -654,6 +650,34 @@ public void timeout_exception_tags_the_event() throws JsonProcessingException, T
         verifyNoInteractions(grokProcessingErrorsCounter, grokProcessingMismatchCounter);
     }
 
+    @Test
+    public void timeout_exception_tags_the_event_with_tags_on_match_failure()
+            throws JsonProcessingException, TimeoutException, ExecutionException, InterruptedException {
+        when(grokProcessorConfig.getTagsOnTimeout()).thenReturn(Collections.emptyList());
+        when(task.get(GrokProcessorConfig.DEFAULT_TIMEOUT_MILLIS, TimeUnit.MILLISECONDS)).thenThrow(TimeoutException.class);
+
+        grokProcessor = createObjectUnderTest();
+
+        capture.put("key_capture_1", "value_capture_1");
+        capture.put("key_capture_2", "value_capture_2");
+        capture.put("key_capture_3", "value_capture_3");
+
+        final Map<String, Object> testData = new HashMap();
+        testData.put("message", messageInput);
+        final Record<Event> record = buildRecordWithEvent(testData);
+
+        final List<Record<Event>> grokkedRecords = (List<Record<Event>>) grokProcessor.doExecute(Collections.singletonList(record));
+
+        assertThat(grokkedRecords.size(), equalTo(1));
+        assertThat(grokkedRecords.get(0), notNullValue());
+        assertRecordsAreEqual(grokkedRecords.get(0), record);
+        assertThat(record.getData().getMetadata().getTags(), hasItem(tagOnMatchFailure1));
+        assertThat(record.getData().getMetadata().getTags(), hasItem(tagOnMatchFailure2));
+        verify(grokProcessingTimeoutsCounter, times(1)).increment();
+        verify(grokProcessingTime, times(1)).record(any(Runnable.class));
+        verifyNoInteractions(grokProcessingErrorsCounter, grokProcessingMismatchCounter);
+    }
+
     @ParameterizedTest
     @ValueSource(classes = {ExecutionException.class, InterruptedException.class, RuntimeException.class})
     public void execution_exception_tags_the_event(Class exceptionClass) throws JsonProcessingException, TimeoutException, ExecutionException, InterruptedException {
@@ -720,7 +744,7 @@ public void testBreakOnMatchTrue() throws JsonProcessingException {
 
     @Test
     public void testBreakOnMatchFalse() throws JsonProcessingException {
-        pluginSetting.getSettings().put(GrokProcessorConfig.BREAK_ON_MATCH, false);
+        when(grokProcessorConfig.isBreakOnMatch()).thenReturn(false);
 
         grokProcessor = createObjectUnderTest();
         when(grokSecondMatch.match(messageInput)).thenReturn(secondMatch);
@@ -756,10 +780,8 @@ public void testBreakOnMatchFalse() throws JsonProcessingException {
         }
     }
 
-    private PluginSetting getDefaultPluginSetting() {
-
-        return completePluginSettingForGrokProcessor(
-                GrokProcessorConfig.DEFAULT_BREAK_ON_MATCH,
+    private void configureDefaultGrokProcessorConfig() {
+        completeMockGrokProcessorConfig(GrokProcessorConfig.DEFAULT_BREAK_ON_MATCH,
                 GrokProcessorConfig.DEFAULT_KEEP_EMPTY_CAPTURES,
                 matchConfig,
                 GrokProcessorConfig.DEFAULT_NAMED_CAPTURES_ONLY,
@@ -775,7 +797,7 @@ private PluginSetting getDefaultPluginSetting() {
     @Test
     public void testNoGrok_when_GrokWhen_returns_false() throws JsonProcessingException {
         final String grokWhen = UUID.randomUUID().toString();
-        pluginSetting.getSettings().put(GrokProcessorConfig.GROK_WHEN, grokWhen);
+        when(grokProcessorConfig.getGrokWhen()).thenReturn(grokWhen);
 
         grokProcessor = createObjectUnderTest();
         capture.put("key_capture_1", "value_capture_1");
@@ -796,31 +818,28 @@ public void testNoGrok_when_GrokWhen_returns_false() throws JsonProcessingExcept
         verifyNoInteractions(grok, grokSecondMatch);
     }
 
-    private PluginSetting completePluginSettingForGrokProcessor(final boolean breakOnMatch,
-                                                                final boolean keepEmptyCaptures,
-                                                                final Map<String, List<String>> match,
-                                                                final boolean namedCapturesOnly,
-                                                                final List<String> keysToOverwrite,
-                                                                final List<String> patternsDirectories,
-                                                                final String patternsFilesGlob,
-                                                                final Map<String, String> patternDefinitions,
-                                                                final int timeoutMillis,
-                                                                final String targetKey,
-                                                                final String grokWhen) {
-        final Map<String, Object> settings = new HashMap<>();
-        settings.put(GrokProcessorConfig.BREAK_ON_MATCH, breakOnMatch);
-        settings.put(GrokProcessorConfig.NAMED_CAPTURES_ONLY, namedCapturesOnly);
-        settings.put(GrokProcessorConfig.MATCH, match);
-        settings.put(GrokProcessorConfig.KEEP_EMPTY_CAPTURES, keepEmptyCaptures);
-        settings.put(GrokProcessorConfig.KEYS_TO_OVERWRITE, keysToOverwrite);
-        settings.put(GrokProcessorConfig.PATTERNS_DIRECTORIES, patternsDirectories);
-        settings.put(GrokProcessorConfig.PATTERN_DEFINITIONS, patternDefinitions);
-        settings.put(GrokProcessorConfig.PATTERNS_FILES_GLOB, patternsFilesGlob);
-        settings.put(GrokProcessorConfig.TIMEOUT_MILLIS, timeoutMillis);
-        settings.put(GrokProcessorConfig.TARGET_KEY, targetKey);
-        settings.put(GrokProcessorConfig.GROK_WHEN, grokWhen);
-
-        return new PluginSetting(PLUGIN_NAME, settings);
+    private void completeMockGrokProcessorConfig(final boolean breakOnMatch,
+                                                 final boolean keepEmptyCaptures,
+                                                 final Map<String, List<String>> match,
+                                                 final boolean namedCapturesOnly,
+                                                 final List<String> keysToOverwrite,
+                                                 final List<String> patternsDirectories,
+                                                 final String patternsFilesGlob,
+                                                 final Map<String, String> patternDefinitions,
+                                                 final int timeoutMillis,
+                                                 final String targetKey,
+                                                 final String grokWhen) {
+        lenient().when(grokProcessorConfig.isBreakOnMatch()).thenReturn(breakOnMatch);
+        lenient().when(grokProcessorConfig.isNamedCapturesOnly()).thenReturn(namedCapturesOnly);
+        lenient().when(grokProcessorConfig.getMatch()).thenReturn(match);
+        lenient().when(grokProcessorConfig.isKeepEmptyCaptures()).thenReturn(keepEmptyCaptures);
+        lenient().when(grokProcessorConfig.getkeysToOverwrite()).thenReturn(keysToOverwrite);
+        lenient().when(grokProcessorConfig.getPatternsDirectories()).thenReturn(patternsDirectories);
+        lenient().when(grokProcessorConfig.getPatternDefinitions()).thenReturn(patternDefinitions);
+        lenient().when(grokProcessorConfig.getPatternsFilesGlob()).thenReturn(patternsFilesGlob);
+        lenient().when(grokProcessorConfig.getTimeoutMillis()).thenReturn(timeoutMillis);
+        lenient().when(grokProcessorConfig.getTargetKey()).thenReturn(targetKey);
+        lenient().when(grokProcessorConfig.getGrokWhen()).thenReturn(grokWhen);
     }
 
     private void assertRecordsAreEqual(final Record<Event> first, final Record<Event> second) throws JsonProcessingException {
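[Editor's note, not part of the patch] With `GrokProcessorConfig` now a Mockito mock rather than a real `PluginSetting`, every default getter stub in `completeMockGrokProcessorConfig` is wrapped in `lenient()`. Under `MockitoExtension`'s strict stubbing, a stubbed call that a given test never exercises would otherwise fail that test. A minimal sketch of the distinction:

    import static org.mockito.Mockito.lenient;
    import static org.mockito.Mockito.mock;

    GrokProcessorConfig config = mock(GrokProcessorConfig.class);
    // A strict stub fails the test with UnnecessaryStubbingException when the
    // stubbed getter is never called; lenient() relaxes that check, which is
    // what shared @BeforeEach setup needs.
    lenient().when(config.getTimeoutMillis()).thenReturn(GrokProcessorConfig.DEFAULT_TIMEOUT_MILLIS);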
diff --git a/data-prepper-plugins/http-common/build.gradle b/data-prepper-plugins/http-common/build.gradle
index fa0e1c3efb..54fa5d346d 100644
--- a/data-prepper-plugins/http-common/build.gradle
+++ b/data-prepper-plugins/http-common/build.gradle
@@ -6,7 +6,6 @@ dependencies {
     implementation 'org.apache.httpcomponents:httpcore:4.4.16'
 
     testImplementation testLibs.bundles.junit
-    testImplementation testLibs.mockito.inline
 }
 
 jacocoTestCoverageVerification {
diff --git a/data-prepper-plugins/http-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/http-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker
deleted file mode 100644
index 23c33feb6d..0000000000
--- a/data-prepper-plugins/http-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker
+++ /dev/null
@@ -1,3 +0,0 @@
-# To enable mocking of final classes with vanilla Mockito
-# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods
-mock-maker-inline
diff --git a/data-prepper-plugins/http-source-common/src/main/java/org/opensearch/dataprepper/http/codec/JsonCodec.java b/data-prepper-plugins/http-source-common/src/main/java/org/opensearch/dataprepper/http/codec/JsonCodec.java
index fc25193a9d..4c0020a83e 100644
--- a/data-prepper-plugins/http-source-common/src/main/java/org/opensearch/dataprepper/http/codec/JsonCodec.java
+++ b/data-prepper-plugins/http-source-common/src/main/java/org/opensearch/dataprepper/http/codec/JsonCodec.java
@@ -10,6 +10,7 @@
 import com.linecorp.armeria.common.HttpData;
 
 import java.io.IOException;
+import java.nio.charset.Charset;
 import java.util.ArrayList;
 import java.util.List;
 import java.util.Map;
@@ -56,7 +57,7 @@ public List<List<String>> parse(HttpData httpData, int maxSize) throws IOExcepti
                 size = OVERHEAD_CHARACTERS.length();
             }
             innerJsonList.add(recordString);
-            size += recordString.length() + COMMA_OVERHEAD_LENGTH;
+            size += recordString.getBytes(Charset.defaultCharset()).length + COMMA_OVERHEAD_LENGTH;
         }
         if (size > OVERHEAD_CHARACTERS.length()) {
             jsonList.add(innerJsonList);
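[Editor's note, not part of the patch] The one-line JsonCodec fix above is the substantive change here: chunk size was previously accumulated with `String.length()`, which counts UTF-16 code units, while the limit is a byte budget, so non-ASCII JSON could yield chunks larger than `maxSize` once encoded. A short illustration, assuming a UTF-8 default charset as on typical deployments:

    import java.nio.charset.StandardCharsets;

    String ascii = "{\"a\":\"b\"}";
    String greek = "{\"Ὂ\":\"Ὂ\"}";
    // Both strings are 9 chars long, but not 9 bytes long:
    System.out.println(ascii.getBytes(StandardCharsets.UTF_8).length); // 9
    System.out.println(greek.getBytes(StandardCharsets.UTF_8).length); // 13, 'Ὂ' takes 3 bytes

The patch measures with `Charset.defaultCharset()` rather than a fixed UTF-8, so the accounting matches whatever encoding the JVM actually uses when emitting the data.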
diff --git a/data-prepper-plugins/http-source-common/src/test/java/org/opensearch/dataprepper/http/codec/JsonCodecTest.java b/data-prepper-plugins/http-source-common/src/test/java/org/opensearch/dataprepper/http/codec/JsonCodecTest.java
index 4863667bc0..8843d8d6e7 100644
--- a/data-prepper-plugins/http-source-common/src/test/java/org/opensearch/dataprepper/http/codec/JsonCodecTest.java
+++ b/data-prepper-plugins/http-source-common/src/test/java/org/opensearch/dataprepper/http/codec/JsonCodecTest.java
@@ -7,16 +7,31 @@
 import com.linecorp.armeria.common.HttpData;
 import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtensionContext;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.Arguments;
+import org.junit.jupiter.params.provider.ArgumentsProvider;
+import org.junit.jupiter.params.provider.ArgumentsSource;
 
 import java.io.IOException;
+import java.nio.charset.Charset;
 import java.util.List;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
 
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.CoreMatchers.notNullValue;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.greaterThanOrEqualTo;
+import static org.hamcrest.Matchers.lessThanOrEqualTo;
 import static org.junit.jupiter.api.Assertions.assertEquals;
 import static org.junit.jupiter.api.Assertions.assertThrows;
+import static org.junit.jupiter.params.provider.Arguments.arguments;
 
 class JsonCodecTest {
     private final HttpData goodTestData = HttpData.ofUtf8("[{\"a\":\"b\"}, {\"c\":\"d\"}]");
     private final HttpData goodLargeTestData = HttpData.ofUtf8("[{\"a1\":\"b1\"}, {\"a2\":\"b2\"}, {\"a3\":\"b3\"}, {\"a4\":\"b4\"}, {\"a5\":\"b5\"}]");
+    private final HttpData goodLargeTestDataUnicode = HttpData.ofUtf8("[{\"ὊὊὊ1\":\"ὊὊὊ1\"}, {\"ὊὊὊ2\":\"ὊὊὊ2\"}, {\"a3\":\"b3\"}, {\"ὊὊὊ4\":\"ὊὊὊ4\"}]");
     private final HttpData badTestDataJsonLine = HttpData.ofUtf8("{\"a\":\"b\"}");
    private final HttpData badTestDataMultiJsonLines = HttpData.ofUtf8("{\"a\":\"b\"}{\"c\":\"d\"}");
     private final HttpData badTestDataNonJson = HttpData.ofUtf8("non json content");
@@ -51,6 +66,25 @@ public void testParseSuccessWithMaxSize() throws IOException {
         assertEquals("{\"a5\":\"b5\"}", res.get(2).get(0));
     }
 
+    @ParameterizedTest
+    @ArgumentsSource(JsonArrayWithKnownFirstArgumentsProvider.class)
+    public void parse_should_return_lists_smaller_than_provided_length(
+            final String inputJsonArray, final String knownFirstPart) throws IOException {
+        final int knownSingleBodySize = knownFirstPart.getBytes(Charset.defaultCharset()).length;
+        final int maxSize = (knownSingleBodySize * 2) + 3;
+        final List<List<String>> chunkedBodies = objectUnderTest.parse(HttpData.ofUtf8(inputJsonArray),
+                maxSize);
+
+        assertThat(chunkedBodies, notNullValue());
+        assertThat(chunkedBodies.size(), greaterThanOrEqualTo(1));
+        final String firstReconstructed = chunkedBodies.get(0).stream().collect(Collectors.joining(",", "[", "]"));
+        assertThat(firstReconstructed.getBytes(Charset.defaultCharset()).length,
+                lessThanOrEqualTo(maxSize));
+
+        assertThat(chunkedBodies.get(0).size(), greaterThanOrEqualTo(1));
+        assertThat(chunkedBodies.get(0).get(0), equalTo(knownFirstPart));
+    }
+
     @Test
     public void testParseJsonLineFailure() {
         assertThrows(IOException.class, () -> objectUnderTest.parse(badTestDataJsonLine));
@@ -65,4 +99,18 @@ public void testParseMultiJsonLinesFailure() {
     public void testParseNonJsonFailure() {
         assertThrows(IOException.class, () -> objectUnderTest.parse(badTestDataNonJson));
     }
+
+    static class JsonArrayWithKnownFirstArgumentsProvider implements ArgumentsProvider {
+        @Override
+        public Stream<? extends Arguments> provideArguments(ExtensionContext extensionContext) throws Exception {
+            return Stream.of(
+                    arguments(
+                            "[{\"ὊὊὊ1\":\"ὊὊὊ1\"}, {\"ὊὊὊ2\":\"ὊὊὊ2\"}, {\"a3\":\"b3\"}, {\"ὊὊὊ4\":\"ὊὊὊ4\"}]",
+                            "{\"ὊὊὊ1\":\"ὊὊὊ1\"}"),
+                    arguments(
+                            "[{\"aaa1\":\"aaa1\"}, {\"aaa2\":\"aaa2\"}, {\"a3\":\"b3\"}, {\"bbb4\":\"bbb4\"}]",
+                            "{\"aaa1\":\"aaa1\"}")
+            );
+        }
+    }
 }
\ No newline at end of file
diff --git a/data-prepper-plugins/http-source-common/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/http-source-common/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker
deleted file mode 100644
index 78ccc25012..0000000000
--- a/data-prepper-plugins/http-source-common/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker
+++ /dev/null
@@ -1,3 +0,0 @@
-# To enable mocking of final classes with vanilla Mockito
-# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods
-mock-maker-inline
\ No newline at end of file
diff --git a/data-prepper-plugins/http-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/http-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker
deleted file mode 100644
index 78ccc25012..0000000000
--- a/data-prepper-plugins/http-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker
+++ /dev/null
@@ -1,3 +0,0 @@
-# To enable mocking of final classes with vanilla Mockito
-# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods
-mock-maker-inline
\ No newline at end of file
diff --git a/data-prepper-plugins/kafka-plugins/build.gradle b/data-prepper-plugins/kafka-plugins/build.gradle
index c88fde1365..046aef949a 100644
--- a/data-prepper-plugins/kafka-plugins/build.gradle
+++ b/data-prepper-plugins/kafka-plugins/build.gradle
@@ -29,6 +29,8 @@ dependencies {
     implementation project(':data-prepper-plugins:buffer-common')
     implementation project(':data-prepper-plugins:blocking-buffer')
     implementation project(':data-prepper-plugins:aws-plugin-api')
+    // bump io.confluent:* dependencies correspondingly when bumping org.apache.kafka.*
+    // https://docs.confluent.io/platform/current/release-notes/index.html
     implementation 'org.apache.kafka:kafka-clients:3.6.1'
     implementation 'org.apache.kafka:connect-json:3.6.1'
     implementation project(':data-prepper-plugins:http-common')
@@ -36,9 +38,9 @@ dependencies {
     implementation 'com.fasterxml.jackson.core:jackson-databind'
     implementation 'io.micrometer:micrometer-core'
     implementation libs.commons.lang3
-    implementation 'io.confluent:kafka-avro-serializer:7.4.0'
-    implementation 'io.confluent:kafka-json-schema-serializer:7.4.0'
-    implementation 'io.confluent:kafka-schema-registry-client:7.4.0'
+    implementation 'io.confluent:kafka-avro-serializer:7.6.0'
+    implementation 'io.confluent:kafka-json-schema-serializer:7.6.0'
+    implementation 'io.confluent:kafka-schema-registry-client:7.6.0'
     implementation 'software.amazon.awssdk:sts'
     implementation 'software.amazon.awssdk:auth'
     implementation 'software.amazon.awssdk:kafka'
@@ -51,7 +53,6 @@ dependencies {
     implementation 'software.amazon.awssdk:s3'
     implementation 'software.amazon.awssdk:apache-client'
 
-    testImplementation testLibs.mockito.inline
     testImplementation 'org.yaml:snakeyaml:2.2'
     testImplementation testLibs.spring.test
     testImplementation 'com.fasterxml.jackson.datatype:jackson-datatype-jsr310'
@@ -60,12 +61,10 @@ dependencies {
     testImplementation project(':data-prepper-core')
     testImplementation project(':data-prepper-plugin-framework')
     testImplementation project(':data-prepper-pipeline-parser')
-    testImplementation testLibs.mockito.inline
     testImplementation 'org.apache.kafka:kafka_2.13:3.6.1'
     testImplementation 'org.apache.kafka:kafka_2.13:3.6.1:test'
     testImplementation 'org.apache.curator:curator-test:5.5.0'
     testImplementation('com.kjetland:mbknor-jackson-jsonschema_2.13:1.0.39')
-    testImplementation group: 'org.powermock', name: 'powermock-api-mockito2', version: '2.0.9'
     testImplementation project(':data-prepper-plugins:otel-metrics-source')
     testImplementation project(':data-prepper-plugins:otel-proto-common')
     testImplementation libs.opentelemetry.proto
@@ -75,8 +74,8 @@ dependencies {
     testImplementation 'com.fasterxml.jackson.dataformat:jackson-dataformat-yaml'
 
     integrationTestImplementation testLibs.junit.vintage
-    integrationTestImplementation 'io.confluent:kafka-schema-registry:7.4.0'
-    integrationTestImplementation ('io.confluent:kafka-schema-registry:7.4.0:tests') {
+    integrationTestImplementation 'io.confluent:kafka-schema-registry:7.6.0'
+    integrationTestImplementation ('io.confluent:kafka-schema-registry:7.6.0:tests') {
         exclude group: 'org.glassfish.jersey.containers', module: 'jersey-container-servlet'
         exclude group: 'org.glassfish.jersey.inject', module: 'jersey-hk2'
         exclude group: 'org.glassfish.jersey.ext', module: 'jersey-bean-validation'
diff --git a/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/common/KafkaMdc.java b/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/common/KafkaMdc.java
index 9ae8985908..785d565e78 100644
--- a/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/common/KafkaMdc.java
+++ b/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/common/KafkaMdc.java
@@ -3,6 +3,8 @@
  * SPDX-License-Identifier: Apache-2.0
  */
 
-package org.opensearch.dataprepper.plugins.kafka.common;public class KafkaMdc {
+package org.opensearch.dataprepper.plugins.kafka.common;
+
+public class KafkaMdc {
     public static final String MDC_KAFKA_PLUGIN_KEY = "kafkaPluginType";
 }
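[Editor's note, not part of the patch] The KafkaMdc hunk above fixes a file whose package statement and class declaration had been fused on one line. The constant it defines is the contract between the pieces that follow: the Kafka source puts its plugin type under this key and the thread factory propagates it to worker threads. A sketch of a hypothetical consumer of the key:

    import org.slf4j.MDC;

    // Anything reading the MDC during a Kafka plugin's work sees the type,
    // e.g. a Logback pattern with %X{kafkaPluginType}, or code such as:
    final String kafkaPluginType = MDC.get(KafkaMdc.MDC_KAFKA_PLUGIN_KEY); // "source", "sink", ...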
diff --git a/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/common/thread/KafkaPluginThreadFactory.java b/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/common/thread/KafkaPluginThreadFactory.java
index a05540c320..b5dede6cda 100644
--- a/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/common/thread/KafkaPluginThreadFactory.java
+++ b/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/common/thread/KafkaPluginThreadFactory.java
@@ -25,7 +25,16 @@ public class KafkaPluginThreadFactory implements ThreadFactory {
             final ThreadFactory delegateThreadFactory,
             final String kafkaPluginType) {
         this.delegateThreadFactory = delegateThreadFactory;
-        this.threadPrefix = "kafka-" + kafkaPluginType + "-";
+        this.threadPrefix = createPluginPart(kafkaPluginType);
+        this.kafkaPluginType = kafkaPluginType;
+    }
+
+    KafkaPluginThreadFactory(
+            final ThreadFactory delegateThreadFactory,
+            final String kafkaPluginType,
+            final String kafkaTopic) {
+        this.delegateThreadFactory = delegateThreadFactory;
+        this.threadPrefix = normalizeName(kafkaTopic) + "-" + createPluginPart(kafkaPluginType);
         this.kafkaPluginType = kafkaPluginType;
     }
 
@@ -39,6 +48,29 @@ public static KafkaPluginThreadFactory defaultExecutorThreadFactory(final String
         return new KafkaPluginThreadFactory(Executors.defaultThreadFactory(), kafkaPluginType);
     }
 
+    /**
+     * Creates an instance specifically for use with {@link Executors}.
+     *
+     * @param kafkaPluginType The name of the plugin type. e.g. sink, source, buffer
+     * @param kafkaTopic The name of the Kafka topic, used as the thread-name prefix.
+     * @return An instance of the {@link KafkaPluginThreadFactory}.
+     */
+    public static KafkaPluginThreadFactory defaultExecutorThreadFactory(
+            final String kafkaPluginType,
+            final String kafkaTopic) {
+        return new KafkaPluginThreadFactory(Executors.defaultThreadFactory(), kafkaPluginType, kafkaTopic);
+    }
+
+    private static String createPluginPart(final String kafkaPluginType) {
+        return "kafka-" + kafkaPluginType + "-";
+    }
+
+    private static String normalizeName(final String kafkaTopic) {
+        final String limitedName = kafkaTopic.length() > 20 ? kafkaTopic.substring(0, 20) : kafkaTopic;
+        return limitedName
+                .toLowerCase().replaceAll("[^a-z0-9]", "-");
+    }
+
     @Override
     public Thread newThread(final Runnable runnable) {
         final Thread thread = delegateThreadFactory.newThread(() -> {
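[Editor's note, not part of the patch] The new factory overload derives thread names from the topic: the topic name is truncated to 20 characters, lower-cased, and reduced to `[a-z0-9]` with `-` as the replacement, then prefixed to the existing `kafka-<pluginType>-<counter>` scheme. A usage sketch with an assumed topic name:

    import java.util.concurrent.ThreadFactory;

    ThreadFactory threadFactory =
            KafkaPluginThreadFactory.defaultExecutorThreadFactory("source", "My Topic!Name");
    // Successive threads would be named, per the normalization rules above:
    //   my-topic-name-kafka-source-1
    //   my-topic-name-kafka-source-2

The cap keeps names readable in thread dumps while still making it obvious which topic a stuck worker belongs to.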
diff --git a/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/consumer/KafkaCustomConsumerFactory.java b/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/consumer/KafkaCustomConsumerFactory.java
index d703538e42..0d091b8af7 100644
--- a/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/consumer/KafkaCustomConsumerFactory.java
+++ b/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/consumer/KafkaCustomConsumerFactory.java
@@ -55,6 +55,8 @@
 import java.util.concurrent.atomic.AtomicBoolean;
 import java.util.stream.IntStream;
 
+import static org.opensearch.dataprepper.logging.DataPrepperMarkers.SENSITIVE;
+
 public class KafkaCustomConsumerFactory {
     private static final Logger LOG = LoggerFactory.getLogger(KafkaCustomConsumerFactory.class);
@@ -136,7 +138,7 @@ private Properties getConsumerProperties(final KafkaConsumerConfig sourceConfig,
         }
         setConsumerTopicProperties(properties, topicConfig, topicConfig.getGroupId());
         setSchemaRegistryProperties(sourceConfig, properties, topicConfig);
-        LOG.debug("Starting consumer with the properties : {}", properties);
+        LOG.debug(SENSITIVE, "Starting consumer with the properties : {}", properties);
         return properties;
     }
 
diff --git a/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/source/KafkaSource.java b/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/source/KafkaSource.java
index 6a01a91bf0..e235594ce2 100644
--- a/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/source/KafkaSource.java
+++ b/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/source/KafkaSource.java
@@ -29,6 +29,8 @@
 import org.opensearch.dataprepper.model.plugin.PluginConfigObservable;
 import org.opensearch.dataprepper.model.record.Record;
 import org.opensearch.dataprepper.model.source.Source;
+import org.opensearch.dataprepper.plugins.kafka.common.KafkaMdc;
+import org.opensearch.dataprepper.plugins.kafka.common.thread.KafkaPluginThreadFactory;
 import org.opensearch.dataprepper.plugins.kafka.configuration.AuthConfig;
 import org.opensearch.dataprepper.plugins.kafka.configuration.TopicConsumerConfig;
 import org.opensearch.dataprepper.plugins.kafka.configuration.OAuthConfig;
@@ -46,6 +48,7 @@
 import org.opensearch.dataprepper.plugins.kafka.util.MessageFormat;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
+import org.slf4j.MDC;
 
 import java.io.IOException;
 import java.util.ArrayList;
@@ -61,6 +64,8 @@
 import java.util.concurrent.atomic.AtomicBoolean;
 import java.util.stream.IntStream;
 
+import static org.opensearch.dataprepper.logging.DataPrepperMarkers.SENSITIVE;
+
 /**
  * The starting point of the Kafka-source plugin and the Kafka consumer
  * properties and kafka multithreaded consumers are being handled here.
@@ -71,10 +76,10 @@ public class KafkaSource implements Source<Record<Event>> {
     private static final String NO_RESOLVABLE_URLS_ERROR_MESSAGE = "No resolvable bootstrap urls given in bootstrap.servers";
     private static final long RETRY_SLEEP_INTERVAL = 30000;
+    private static final String MDC_KAFKA_PLUGIN_VALUE = "source";
     private static final Logger LOG = LoggerFactory.getLogger(KafkaSource.class);
     private final KafkaSourceConfig sourceConfig;
     private final AtomicBoolean shutdownInProgress;
-    private ExecutorService executorService;
     private final PluginMetrics pluginMetrics;
     private KafkaCustomConsumer consumer;
     private KafkaConsumer kafkaConsumer;
@@ -110,59 +115,65 @@ public KafkaSource(final KafkaSourceConfig sourceConfig,
 
     @Override
     public void start(Buffer<Record<Event>> buffer) {
-        Properties authProperties = new Properties();
-        KafkaSecurityConfigurer.setDynamicSaslClientCallbackHandler(authProperties, sourceConfig, pluginConfigObservable);
-        KafkaSecurityConfigurer.setAuthProperties(authProperties, sourceConfig, LOG);
-        sourceConfig.getTopics().forEach(topic -> {
-            consumerGroupID = topic.getGroupId();
-            KafkaTopicConsumerMetrics topicMetrics = new KafkaTopicConsumerMetrics(topic.getName(), pluginMetrics, true);
-            Properties consumerProperties = getConsumerProperties(topic, authProperties);
-            MessageFormat schema = MessageFormat.getByMessageFormatByName(schemaType);
-            try {
-                int numWorkers = topic.getWorkers();
-                executorService = Executors.newFixedThreadPool(numWorkers);
-                allTopicExecutorServices.add(executorService);
-
-                IntStream.range(0, numWorkers).forEach(index -> {
-                    while (true) {
-                        try {
-                            kafkaConsumer = createKafkaConsumer(schema, consumerProperties);
-                            break;
-                        } catch (ConfigException ce) {
-                            if (ce.getMessage().contains(NO_RESOLVABLE_URLS_ERROR_MESSAGE)) {
-                                LOG.warn("Exception while creating Kafka consumer: ", ce);
-                                LOG.warn("Bootstrap URL could not be resolved. Retrying in {} ms...", RETRY_SLEEP_INTERVAL);
-                                try {
-                                    sleep(RETRY_SLEEP_INTERVAL);
-                                } catch (InterruptedException ie) {
-                                    Thread.currentThread().interrupt();
-                                    throw new RuntimeException(ie);
-                                }
-                            } else {
-                                throw ce;
-                            }
-                        }
-                    }
-                    consumer = new KafkaCustomConsumer(kafkaConsumer, shutdownInProgress, buffer, sourceConfig, topic, schemaType,
-                            acknowledgementSetManager, null, topicMetrics, PauseConsumePredicate.noPause());
-                    allTopicConsumers.add(consumer);
-
-                    executorService.submit(consumer);
-                });
-            } catch (Exception e) {
-                if (e instanceof BrokerNotAvailableException || e instanceof TimeoutException) {
-                    LOG.error("The kafka broker is not available...");
-                } else {
-                    LOG.error("Failed to setup the Kafka Source Plugin.", e);
-                }
-                throw new RuntimeException(e);
-            }
-            LOG.info("Started Kafka source for topic " + topic.getName());
-        });
+        try {
+            setMdc();
+            Properties authProperties = new Properties();
+            KafkaSecurityConfigurer.setDynamicSaslClientCallbackHandler(authProperties, sourceConfig, pluginConfigObservable);
+            KafkaSecurityConfigurer.setAuthProperties(authProperties, sourceConfig, LOG);
+            sourceConfig.getTopics().forEach(topic -> {
+                consumerGroupID = topic.getGroupId();
+                KafkaTopicConsumerMetrics topicMetrics = new KafkaTopicConsumerMetrics(topic.getName(), pluginMetrics, true);
+                Properties consumerProperties = getConsumerProperties(topic, authProperties);
+                MessageFormat schema = MessageFormat.getByMessageFormatByName(schemaType);
+                try {
+                    int numWorkers = topic.getWorkers();
+                    final ExecutorService executorService = Executors.newFixedThreadPool(
+                            numWorkers, KafkaPluginThreadFactory.defaultExecutorThreadFactory(MDC_KAFKA_PLUGIN_VALUE, topic.getName()));
+                    allTopicExecutorServices.add(executorService);
+
+                    IntStream.range(0, numWorkers).forEach(index -> {
+                        while (true) {
+                            try {
+                                kafkaConsumer = createKafkaConsumer(schema, consumerProperties);
+                                break;
+                            } catch (ConfigException ce) {
+                                if (ce.getMessage().contains(NO_RESOLVABLE_URLS_ERROR_MESSAGE)) {
+                                    LOG.warn("Exception while creating Kafka consumer: ", ce);
+                                    LOG.warn("Bootstrap URL could not be resolved. Retrying in {} ms...", RETRY_SLEEP_INTERVAL);
+                                    try {
+                                        sleep(RETRY_SLEEP_INTERVAL);
+                                    } catch (InterruptedException ie) {
+                                        Thread.currentThread().interrupt();
+                                        throw new RuntimeException(ie);
+                                    }
+                                } else {
+                                    throw ce;
+                                }
+                            }
+                        }
+                        consumer = new KafkaCustomConsumer(kafkaConsumer, shutdownInProgress, buffer, sourceConfig, topic, schemaType,
+                                acknowledgementSetManager, null, topicMetrics, PauseConsumePredicate.noPause());
+                        allTopicConsumers.add(consumer);
+                        executorService.submit(consumer);
+                    });
+                } catch (Exception e) {
+                    if (e instanceof BrokerNotAvailableException || e instanceof TimeoutException) {
+                        LOG.error("The kafka broker is not available...");
+                    } else {
+                        LOG.error("Failed to setup the Kafka Source Plugin.", e);
+                    }
+                    throw new RuntimeException(e);
+                }
+                LOG.info("Started Kafka source for topic " + topic.getName());
+            });
+        } finally {
+            removeMdc();
+        }
     }
 
-    public KafkaConsumer createKafkaConsumer(final MessageFormat schema, final Properties consumerProperties) {
+    KafkaConsumer createKafkaConsumer(final MessageFormat schema, final Properties consumerProperties) {
         switch (schema) {
             case JSON:
                 return new KafkaConsumer(consumerProperties);
@@ -181,19 +192,24 @@ public void start(Buffer<Record<Event>> buffer) {
 
     @Override
     public void stop() {
-        shutdownInProgress.set(true);
-        final long shutdownWaitTime = calculateLongestThreadWaitingTime();
+        try {
+            setMdc();
+            shutdownInProgress.set(true);
+            final long shutdownWaitTime = calculateLongestThreadWaitingTime();
 
-        LOG.info("Shutting down {} Executor services", allTopicExecutorServices.size());
-        allTopicExecutorServices.forEach(executor -> stopExecutor(executor, shutdownWaitTime));
+            LOG.info("Shutting down {} Executor services", allTopicExecutorServices.size());
+            allTopicExecutorServices.forEach(executor -> stopExecutor(executor, shutdownWaitTime));
 
-        LOG.info("Closing {} consumers", allTopicConsumers.size());
-        allTopicConsumers.forEach(consumer -> consumer.closeConsumer());
+            LOG.info("Closing {} consumers", allTopicConsumers.size());
+            allTopicConsumers.forEach(consumer -> consumer.closeConsumer());
 
-        LOG.info("Kafka source shutdown successfully...");
+            LOG.info("Kafka source shutdown successfully...");
+        } finally {
+            removeMdc();
+        }
     }
 
-    public void stopExecutor(final ExecutorService executorService, final long shutdownWaitTime) {
+    private void stopExecutor(final ExecutorService executorService, final long shutdownWaitTime) {
         executorService.shutdown();
         try {
             if (!executorService.awaitTermination(shutdownWaitTime, TimeUnit.SECONDS)) {
@@ -241,7 +257,7 @@ private Properties getConsumerProperties(final TopicConsumerConfig topicConfig,
         }
         setConsumerTopicProperties(properties, topicConfig);
         setSchemaRegistryProperties(properties, topicConfig);
-        LOG.info("Starting consumer with the properties : {}", properties);
+        LOG.debug(SENSITIVE, "Starting consumer with the properties : {}", properties);
         return properties;
     }
 
@@ -344,7 +360,7 @@ private void setPropertiesForSchemaRegistryConnectivity(Properties properties) {
         }
     }
 
-    protected void sleep(final long millis) throws InterruptedException {
+    void sleep(final long millis) throws InterruptedException {
         Thread.sleep(millis);
     }
 
@@ -364,4 +380,12 @@ private void updateConfig(final KafkaClusterConfigSupplier kafkaClusterConfigSup
             }
         }
     }
+
+    private static void setMdc() {
+        MDC.put(KafkaMdc.MDC_KAFKA_PLUGIN_KEY, MDC_KAFKA_PLUGIN_VALUE);
+    }
+
+    private static void removeMdc() {
+        MDC.remove(KafkaMdc.MDC_KAFKA_PLUGIN_KEY);
+    }
 }
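[Editor's note, not part of the patch] start() and stop() now bracket their work with setMdc()/removeMdc() in try/finally. The MDC is thread-local, so this only covers the thread calling into the plugin; the consumer threads created by the executor do not inherit it, which is why start() now passes the KafkaPluginThreadFactory above rather than the default factory. A sketch of the kind of wrapping that factory applies to each task (an assumption about its internals, consistent with the tests later in this patch):

    import org.slf4j.MDC;

    static Runnable withKafkaMdc(final Runnable delegate, final String pluginType) {
        return () -> {
            // Re-establish the MDC on the worker thread itself.
            MDC.put(KafkaMdc.MDC_KAFKA_PLUGIN_KEY, pluginType);
            try {
                delegate.run();
            } finally {
                MDC.remove(KafkaMdc.MDC_KAFKA_PLUGIN_KEY);
            }
        };
    }

The thread-factory tests below verify the put() side by reading the MDC from inside the wrapped runnable.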
diff --git a/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/util/KafkaSecurityConfigurer.java b/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/util/KafkaSecurityConfigurer.java
index a5e27e4d98..402f248ddf 100644
--- a/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/util/KafkaSecurityConfigurer.java
+++ b/data-prepper-plugins/kafka-plugins/src/main/java/org/opensearch/dataprepper/plugins/kafka/util/KafkaSecurityConfigurer.java
@@ -92,7 +92,8 @@ public class KafkaSecurityConfigurer {
     private static final String SSL_TRUSTSTORE_LOCATION = "ssl.truststore.location";
     private static final String SSL_TRUSTSTORE_PASSWORD = "ssl.truststore.password";
 
-    private static AwsCredentialsProvider credentialsProvider;
+    private static AwsCredentialsProvider mskCredentialsProvider;
+    private static AwsCredentialsProvider awsGlueCredentialsProvider;
     private static GlueSchemaRegistryKafkaDeserializer glueDeserializer;
 
@@ -207,6 +208,9 @@ public static void setAwsIamAuthProperties(Properties properties, final AwsIamAu
         properties.put(SASL_MECHANISM, "AWS_MSK_IAM");
         properties.put(SASL_CLIENT_CALLBACK_HANDLER_CLASS, "software.amazon.msk.auth.iam.IAMClientCallbackHandler");
         if (awsIamAuthConfig == AwsIamAuthConfig.ROLE) {
+            if (Objects.isNull(awsConfig)) {
+                throw new RuntimeException("AWS Config needs to be specified when sasl/aws_msk_iam is set to \"role\"");
+            }
             String baseIamAuthConfig = "software.amazon.msk.auth.iam.IAMLoginModule required " +
                     "awsRoleArn=\"%s\" " +
                     "awsStsRegion=\"%s\"";
@@ -225,14 +229,16 @@ public static void setAwsIamAuthProperties(Properties properties, final AwsIamAu
         }
     }
 
-    public static String getBootStrapServersForMsk(final AwsIamAuthConfig awsIamAuthConfig, final AwsConfig awsConfig, final Logger LOG) {
-        if (awsIamAuthConfig == AwsIamAuthConfig.ROLE) {
+    private static void configureMSKCredentialsProvider(final AuthConfig authConfig, final AwsConfig awsConfig) {
+        mskCredentialsProvider = DefaultCredentialsProvider.create();
+        if (Objects.nonNull(authConfig) && Objects.nonNull(authConfig.getSaslAuthConfig()) &&
+                authConfig.getSaslAuthConfig().getAwsIamAuthConfig() == AwsIamAuthConfig.ROLE) {
             String sessionName = "data-prepper-kafka-session" + UUID.randomUUID();
             StsClient stsClient = StsClient.builder()
                     .region(Region.of(awsConfig.getRegion()))
-                    .credentialsProvider(credentialsProvider)
+                    .credentialsProvider(mskCredentialsProvider)
                     .build();
-            credentialsProvider = StsAssumeRoleCredentialsProvider
+            mskCredentialsProvider = StsAssumeRoleCredentialsProvider
                     .builder()
                     .stsClient(stsClient)
                     .refreshRequest(
@@ -242,12 +248,15 @@ public static String getBootStrapServersForMsk(final AwsIamAuth
                             .roleSessionName(sessionName)
                             .build()
                     ).build();
-        } else if (awsIamAuthConfig != AwsIamAuthConfig.DEFAULT) {
-            throw new RuntimeException("Unknown AWS IAM auth mode");
         }
+    }
+
+    public static String getBootStrapServersForMsk(final AwsConfig awsConfig,
+                                                   final AwsCredentialsProvider mskCredentialsProvider,
+                                                   final Logger log) {
         final AwsConfig.AwsMskConfig awsMskConfig = awsConfig.getAwsMskConfig();
         KafkaClient kafkaClient = KafkaClient.builder()
-                .credentialsProvider(credentialsProvider)
+                .credentialsProvider(mskCredentialsProvider)
                 .region(Region.of(awsConfig.getRegion()))
                 .build();
         final GetBootstrapBrokersRequest request =
@@ -264,7 +273,7 @@ public static String getBootStrapServersForMsk(final AwsIamAuth
             try {
                 result = kafkaClient.getBootstrapBrokers(request);
             } catch (KafkaException | StsException e) {
-                LOG.info("Failed to get bootstrap server information from MSK. Will try every 10 seconds for {} seconds", 10*MAX_KAFKA_CLIENT_RETRIES, e);
+                log.info("Failed to get bootstrap server information from MSK. Will try every 10 seconds for {} seconds", 10*MAX_KAFKA_CLIENT_RETRIES, e);
                 try {
                     Thread.sleep(10000);
                 } catch (InterruptedException exp) {}
@@ -302,16 +311,19 @@ public static void setDynamicSaslClientCallbackHandler(final Properti
             }
         }
     }
-    public static void setAuthProperties(final Properties properties, final KafkaClusterAuthConfig kafkaClusterAuthConfig, final Logger LOG) {
+    public static void setAuthProperties(final Properties properties, final KafkaClusterAuthConfig kafkaClusterAuthConfig, final Logger log) {
         final AwsConfig awsConfig = kafkaClusterAuthConfig.getAwsConfig();
         final AuthConfig authConfig = kafkaClusterAuthConfig.getAuthConfig();
         final EncryptionConfig encryptionConfig = kafkaClusterAuthConfig.getEncryptionConfig();
-        credentialsProvider = DefaultCredentialsProvider.create();
+        configureMSKCredentialsProvider(authConfig, awsConfig);
 
         String bootstrapServers = "";
         if (Objects.nonNull(kafkaClusterAuthConfig.getBootstrapServers())) {
             bootstrapServers = String.join(",", kafkaClusterAuthConfig.getBootstrapServers());
         }
+        if (Objects.nonNull(awsConfig) && Objects.nonNull(awsConfig.getAwsMskConfig())) {
+            bootstrapServers = getBootStrapServersForMsk(awsConfig, mskCredentialsProvider, log);
+        }
 
         if (Objects.nonNull(authConfig)) {
             final AuthConfig.SaslAuthConfig saslAuthConfig = authConfig.getSaslAuthConfig();
@@ -323,11 +335,7 @@ public static void setAuthProperties(final Properties properties, final KafkaClu
                 if (checkEncryptionType(encryptionConfig, EncryptionType.NONE)) {
                     throw new RuntimeException("Encryption Config must be SSL to use IAM authentication mechanism");
                 }
-                if (Objects.isNull(awsConfig)) {
-                    throw new RuntimeException("AWS Config is not specified");
-                }
                 setAwsIamAuthProperties(properties, awsIamAuthConfig, awsConfig);
-                bootstrapServers = getBootStrapServersForMsk(awsIamAuthConfig, awsConfig, LOG);
             } else if (Objects.nonNull(saslAuthConfig.getOAuthConfig())) {
                 setOauthProperties(kafkaClusterAuthConfig, properties);
             } else if (Objects.nonNull(plainTextAuthConfig) && Objects.nonNull(kafkaClusterAuthConfig.getEncryptionConfig())) {
@@ -358,19 +366,45 @@ private static boolean checkEncryptionType(final EncryptionConf
     }
 
     public static GlueSchemaRegistryKafkaDeserializer getGlueSerializer(final KafkaConsumerConfig kafkaConsumerConfig) {
+        configureAwsGlueCredentialsProvider(kafkaConsumerConfig.getAwsConfig());
         SchemaConfig schemaConfig = kafkaConsumerConfig.getSchemaConfig();
         if (Objects.isNull(schemaConfig) || schemaConfig.getType() != SchemaRegistryType.AWS_GLUE) {
             return null;
         }
         Map<String, Object> configs = new HashMap<>();
-        configs.put(AWSSchemaRegistryConstants.AWS_REGION, kafkaConsumerConfig.getAwsConfig().getRegion());
+        final AwsConfig awsConfig = kafkaConsumerConfig.getAwsConfig();
+        if (Objects.nonNull(awsConfig) && Objects.nonNull(awsConfig.getRegion())) {
+            configs.put(AWSSchemaRegistryConstants.AWS_REGION, kafkaConsumerConfig.getAwsConfig().getRegion());
+        }
         configs.put(AWSSchemaRegistryConstants.AVRO_RECORD_TYPE, AvroRecordType.GENERIC_RECORD.getName());
         configs.put(AWSSchemaRegistryConstants.CACHE_TIME_TO_LIVE_MILLIS, "86400000");
         configs.put(AWSSchemaRegistryConstants.CACHE_SIZE, "10");
         configs.put(AWSSchemaRegistryConstants.COMPATIBILITY_SETTING, Compatibility.FULL);
-        glueDeserializer = new GlueSchemaRegistryKafkaDeserializer(credentialsProvider, configs);
+        glueDeserializer = new GlueSchemaRegistryKafkaDeserializer(awsGlueCredentialsProvider, configs);
         return glueDeserializer;
     }
 
+    private static void configureAwsGlueCredentialsProvider(final AwsConfig awsConfig) {
+        awsGlueCredentialsProvider = DefaultCredentialsProvider.create();
+        if (Objects.nonNull(awsConfig) &&
+                Objects.nonNull(awsConfig.getRegion()) && Objects.nonNull(awsConfig.getStsRoleArn())) {
+            String sessionName = "data-prepper-kafka-session" + UUID.randomUUID();
+            StsClient stsClient = StsClient.builder()
+                    .region(Region.of(awsConfig.getRegion()))
+                    .credentialsProvider(awsGlueCredentialsProvider)
+                    .build();
+            awsGlueCredentialsProvider = StsAssumeRoleCredentialsProvider
+                    .builder()
+                    .stsClient(stsClient)
+                    .refreshRequest(
+                            AssumeRoleRequest
+                                    .builder()
+                                    .roleArn(awsConfig.getStsRoleArn())
+                                    .roleSessionName(sessionName)
+                                    .build()
+                    ).build();
+        }
+    }
+
 }
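[Editor's note, not part of the patch] The KafkaSecurityConfigurer change splits the single shared `credentialsProvider` into `mskCredentialsProvider` and `awsGlueCredentialsProvider`, so MSK bootstrap discovery and Glue schema-registry deserialization no longer overwrite each other's credentials. Both follow the same construction: start from the default chain, then swap in an STS assume-role provider when a role ARN is configured. A minimal sketch using the AWS SDK v2 types from the patch, with a hypothetical role ARN:

    import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
    import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
    import software.amazon.awssdk.regions.Region;
    import software.amazon.awssdk.services.sts.StsClient;
    import software.amazon.awssdk.services.sts.auth.StsAssumeRoleCredentialsProvider;
    import software.amazon.awssdk.services.sts.model.AssumeRoleRequest;

    AwsCredentialsProvider provider = DefaultCredentialsProvider.create();
    StsClient stsClient = StsClient.builder()
            .region(Region.US_EAST_2)                                 // assumed region for the sketch
            .credentialsProvider(provider)
            .build();
    provider = StsAssumeRoleCredentialsProvider.builder()
            .stsClient(stsClient)
            .refreshRequest(AssumeRoleRequest.builder()
                    .roleArn("arn:aws:iam::123456789012:role/example") // hypothetical ARN
                    .roleSessionName("data-prepper-kafka-session")
                    .build())
            .build();

`StsAssumeRoleCredentialsProvider` refreshes the temporary credentials automatically, which is why it is preferred here over a one-shot AssumeRole call.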
objectUnderTest = createObjectUnderTest(); @@ -69,6 +82,30 @@ void newThread_creates_thread_with_name() { verify(thread2).setName(String.format("kafka-%s-2", pluginType)); } + @ParameterizedTest + @CsvSource({ + "abcd12,abcd12", + "aBCd12,abcd12", + "abcd-12,abcd-12", + "has space,has-space", + "has!character,has-character", + "this-is-somewhat-too-long,this-is-somewhat-too" + }) + void newThread_with_topic_creates_thread_with_name( + final String topicName, + final String expectedPrefix) { + this.topic = topicName; + final KafkaPluginThreadFactory objectUnderTest = createObjectUnderTestWithTopic(); + + final Thread thread1 = objectUnderTest.newThread(runnable); + assertThat(thread1, notNullValue()); + verify(thread1).setName(String.format("%s-kafka-%s-1", expectedPrefix, pluginType)); + + final Thread thread2 = objectUnderTest.newThread(runnable); + assertThat(thread2, notNullValue()); + verify(thread2).setName(String.format("%s-kafka-%s-2", expectedPrefix, pluginType)); + } + @Test void newThread_creates_thread_with_wrapping_runnable() { createObjectUnderTest().newThread(runnable); @@ -85,6 +122,22 @@ void newThread_creates_thread_with_wrapping_runnable() { verify(runnable).run(); } + @Test + void newThread_with_topic_creates_thread_with_wrapping_runnable() { + createObjectUnderTestWithTopic().newThread(runnable); + + final ArgumentCaptor actualRunnableCaptor = ArgumentCaptor.forClass(Runnable.class); + verify(delegateThreadFactory).newThread(actualRunnableCaptor.capture()); + + final Runnable actualRunnable = actualRunnableCaptor.getValue(); + + assertThat(actualRunnable, not(equalTo(runnable))); + + verifyNoInteractions(runnable); + actualRunnable.run(); + verify(runnable).run(); + } + @Test void newThread_creates_thread_that_calls_MDC_on_run() { createObjectUnderTest().newThread(runnable); @@ -104,4 +157,24 @@ void newThread_creates_thread_that_calls_MDC_on_run() { assertThat(actualKafkaPluginType[0], equalTo(pluginType)); } + + @Test + void newThread_with_topic_creates_thread_that_calls_MDC_on_run() { + createObjectUnderTestWithTopic().newThread(runnable); + + final ArgumentCaptor actualRunnableCaptor = ArgumentCaptor.forClass(Runnable.class); + verify(delegateThreadFactory).newThread(actualRunnableCaptor.capture()); + + final Runnable actualRunnable = actualRunnableCaptor.getValue(); + + final String[] actualKafkaPluginType = new String[1]; + doAnswer(a -> { + actualKafkaPluginType[0] = MDC.get(KafkaMdc.MDC_KAFKA_PLUGIN_KEY); + return null; + }).when(runnable).run(); + + actualRunnable.run(); + + assertThat(actualKafkaPluginType[0], equalTo(pluginType)); + } } \ No newline at end of file diff --git a/data-prepper-plugins/kafka-plugins/src/test/java/org/opensearch/dataprepper/plugins/kafka/source/KafkaSourceTest.java b/data-prepper-plugins/kafka-plugins/src/test/java/org/opensearch/dataprepper/plugins/kafka/source/KafkaSourceTest.java index 1503a7424d..3433a92b76 100644 --- a/data-prepper-plugins/kafka-plugins/src/test/java/org/opensearch/dataprepper/plugins/kafka/source/KafkaSourceTest.java +++ b/data-prepper-plugins/kafka-plugins/src/test/java/org/opensearch/dataprepper/plugins/kafka/source/KafkaSourceTest.java @@ -6,11 +6,14 @@ package org.opensearch.dataprepper.plugins.kafka.source; import org.apache.kafka.common.config.ConfigException; +import org.junit.jupiter.api.AfterEach; import org.junit.jupiter.api.Assertions; import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Nested; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; 
import org.mockito.Mock; +import org.mockito.MockedStatic; import org.mockito.junit.jupiter.MockitoExtension; import org.mockito.junit.jupiter.MockitoSettings; import org.mockito.quality.Strictness; @@ -21,6 +24,7 @@ import org.opensearch.dataprepper.model.event.Event; import org.opensearch.dataprepper.model.plugin.PluginConfigObservable; import org.opensearch.dataprepper.model.record.Record; +import org.opensearch.dataprepper.plugins.kafka.common.KafkaMdc; import org.opensearch.dataprepper.plugins.kafka.configuration.AuthConfig; import org.opensearch.dataprepper.plugins.kafka.configuration.AwsConfig; import org.opensearch.dataprepper.plugins.kafka.configuration.TopicConsumerConfig; @@ -29,6 +33,7 @@ import org.opensearch.dataprepper.plugins.kafka.configuration.SchemaConfig; import org.opensearch.dataprepper.plugins.kafka.extension.KafkaClusterConfigSupplier; import org.opensearch.dataprepper.plugins.kafka.util.MessageFormat; +import org.slf4j.MDC; import java.time.Duration; import java.util.Collections; @@ -41,6 +46,7 @@ import static org.mockito.Mockito.doNothing; import static org.mockito.Mockito.doThrow; import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.mockStatic; import static org.mockito.Mockito.never; import static org.mockito.Mockito.spy; import static org.mockito.Mockito.verify; @@ -230,4 +236,38 @@ void test_updateConfig_not_using_kafkaClusterConfigExtension() { verify(sourceConfig, never()).setAwsConfig(any()); verify(sourceConfig, never()).setEncryptionConfig(any()); } + + @Nested + class MdcTests { + private MockedStatic mdcMockedStatic; + + @BeforeEach + void setUp() { + mdcMockedStatic = mockStatic(MDC.class); + } + + @AfterEach + void tearDown() { + mdcMockedStatic.close(); + } + + @Test + void start_sets_and_removes_MDC() { + when(topic1.getSessionTimeOut()).thenReturn(Duration.ofSeconds(15)); + when(topic2.getSessionTimeOut()).thenReturn(Duration.ofSeconds(15)); + + createObjectUnderTest().start(buffer); + + mdcMockedStatic.verify(() -> MDC.put(KafkaMdc.MDC_KAFKA_PLUGIN_KEY, "source")); + mdcMockedStatic.verify(() -> MDC.remove(KafkaMdc.MDC_KAFKA_PLUGIN_KEY)); + } + + @Test + void stop_sets_and_removes_MDC() { + createObjectUnderTest().stop(); + + mdcMockedStatic.verify(() -> MDC.put(KafkaMdc.MDC_KAFKA_PLUGIN_KEY, "source")); + mdcMockedStatic.verify(() -> MDC.remove(KafkaMdc.MDC_KAFKA_PLUGIN_KEY)); + } + } } diff --git a/data-prepper-plugins/kafka-plugins/src/test/java/org/opensearch/dataprepper/plugins/kafka/util/KafkaSecurityConfigurerTest.java b/data-prepper-plugins/kafka-plugins/src/test/java/org/opensearch/dataprepper/plugins/kafka/util/KafkaSecurityConfigurerTest.java index f1a9af8436..298457e21e 100644 --- a/data-prepper-plugins/kafka-plugins/src/test/java/org/opensearch/dataprepper/plugins/kafka/util/KafkaSecurityConfigurerTest.java +++ b/data-prepper-plugins/kafka-plugins/src/test/java/org/opensearch/dataprepper/plugins/kafka/util/KafkaSecurityConfigurerTest.java @@ -1,8 +1,11 @@ package org.opensearch.dataprepper.plugins.kafka.util; +import com.amazonaws.services.schemaregistry.deserializers.GlueSchemaRegistryKafkaDeserializer; import com.fasterxml.jackson.databind.ObjectMapper; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; +import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.ValueSource; import org.mockito.ArgumentCaptor; import org.mockito.Captor; import org.mockito.Mock; @@ -19,6 +22,14 @@ import org.slf4j.Logger; import org.slf4j.LoggerFactory; import 
org.yaml.snakeyaml.Yaml; +import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider; +import software.amazon.awssdk.regions.Region; +import software.amazon.awssdk.regions.providers.DefaultAwsRegionProviderChain; +import software.amazon.awssdk.services.kafka.KafkaClient; +import software.amazon.awssdk.services.kafka.KafkaClientBuilder; +import software.amazon.awssdk.services.kafka.model.GetBootstrapBrokersRequest; +import software.amazon.awssdk.services.kafka.model.GetBootstrapBrokersResponse; +import software.amazon.awssdk.services.sts.auth.StsAssumeRoleCredentialsProvider; import java.io.FileReader; import java.io.IOException; @@ -27,12 +38,16 @@ import java.util.Map; import java.util.Objects; import java.util.Properties; +import java.util.UUID; import static org.apache.kafka.common.config.SaslConfigs.SASL_CLIENT_CALLBACK_HANDLER_CLASS; import static org.hamcrest.CoreMatchers.equalTo; +import static org.hamcrest.CoreMatchers.instanceOf; +import static org.hamcrest.CoreMatchers.notNullValue; import static org.hamcrest.CoreMatchers.nullValue; import static org.hamcrest.MatcherAssert.assertThat; import static org.hamcrest.CoreMatchers.is; +import static org.mockito.ArgumentMatchers.any; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.mockStatic; import static org.mockito.Mockito.verify; @@ -128,6 +143,136 @@ public void testSetAuthPropertiesAuthSslWithNoCertContentNoTrustStore() throws E assertThat(props.get("ssl.engine.factory.class"), is(nullValue())); } + @Test + public void testSetAuthPropertiesBootstrapServersWithSaslIAMRole() throws IOException { + final Properties props = new Properties(); + final KafkaSourceConfig kafkaSourceConfig = createKafkaSinkConfig("kafka-pipeline-bootstrap-servers-sasl-iam-role.yaml"); + KafkaSecurityConfigurer.setAuthProperties(props, kafkaSourceConfig, LOG); + assertThat(props.getProperty("bootstrap.servers"), is("localhost:9092")); + assertThat(props.getProperty("sasl.mechanism"), is("AWS_MSK_IAM")); + assertThat(props.getProperty("sasl.jaas.config"), + is("software.amazon.msk.auth.iam.IAMLoginModule required " + + "awsRoleArn=\"test_sasl_iam_sts_role\" awsStsRegion=\"us-east-2\";")); + assertThat(props.getProperty("security.protocol"), is("SASL_SSL")); + assertThat(props.getProperty("certificateContent"), is(nullValue())); + assertThat(props.getProperty("ssl.truststore.location"), is(nullValue())); + assertThat(props.getProperty("ssl.truststore.password"), is(nullValue())); + assertThat(props.get("ssl.engine.factory.class"), is(nullValue())); + assertThat(props.get("sasl.client.callback.handler.class"), + is("software.amazon.msk.auth.iam.IAMClientCallbackHandler")); + } + + @Test + public void testSetAuthPropertiesBootstrapServersWithSaslIAMDefault() throws IOException { + final Properties props = new Properties(); + final KafkaSourceConfig kafkaSourceConfig = createKafkaSinkConfig("kafka-pipeline-bootstrap-servers-sasl-iam-default.yaml"); + KafkaSecurityConfigurer.setAuthProperties(props, kafkaSourceConfig, LOG); + assertThat(props.getProperty("bootstrap.servers"), is("localhost:9092")); + assertThat(props.getProperty("sasl.jaas.config"), is("software.amazon.msk.auth.iam.IAMLoginModule required;")); + assertThat(props.getProperty("sasl.mechanism"), is("AWS_MSK_IAM")); + assertThat(props.getProperty("security.protocol"), is("SASL_SSL")); + assertThat(props.getProperty("certificateContent"), is(nullValue())); + assertThat(props.getProperty("ssl.truststore.location"), is(nullValue())); + 
assertThat(props.getProperty("ssl.truststore.password"), is(nullValue())); + assertThat(props.get("ssl.engine.factory.class"), is(nullValue())); + assertThat(props.get("sasl.client.callback.handler.class"), + is("software.amazon.msk.auth.iam.IAMClientCallbackHandler")); + } + + @Test + public void testSetAuthPropertiesBootstrapServersOverrideByMSK() throws IOException { + final String testMSKEndpoint = UUID.randomUUID().toString(); + final Properties props = new Properties(); + final KafkaSourceConfig kafkaSourceConfig = createKafkaSinkConfig("kafka-pipeline-bootstrap-servers-override-by-msk.yaml"); + final KafkaClientBuilder kafkaClientBuilder = mock(KafkaClientBuilder.class); + final KafkaClient kafkaClient = mock(KafkaClient.class); + when(kafkaClientBuilder.credentialsProvider(any())).thenReturn(kafkaClientBuilder); + when(kafkaClientBuilder.region(any(Region.class))).thenReturn(kafkaClientBuilder); + when(kafkaClientBuilder.build()).thenReturn(kafkaClient); + final GetBootstrapBrokersResponse response = mock(GetBootstrapBrokersResponse.class); + when(response.bootstrapBrokerStringSaslIam()).thenReturn(testMSKEndpoint); + when(kafkaClient.getBootstrapBrokers(any(GetBootstrapBrokersRequest.class))).thenReturn(response); + try (MockedStatic mockedKafkaClient = mockStatic(KafkaClient.class)) { + mockedKafkaClient.when(KafkaClient::builder).thenReturn(kafkaClientBuilder); + KafkaSecurityConfigurer.setAuthProperties(props, kafkaSourceConfig, LOG); + } + assertThat(props.getProperty("bootstrap.servers"), is(testMSKEndpoint)); + assertThat(props.getProperty("sasl.mechanism"), is("AWS_MSK_IAM")); + assertThat(props.getProperty("sasl.jaas.config"), + is("software.amazon.msk.auth.iam.IAMLoginModule required awsRoleArn=\"sts_role_arn\" awsStsRegion=\"us-east-2\";")); + assertThat(props.getProperty("security.protocol"), is("SASL_SSL")); + assertThat(props.getProperty("certificateContent"), is(nullValue())); + assertThat(props.getProperty("ssl.truststore.location"), is(nullValue())); + assertThat(props.getProperty("ssl.truststore.password"), is(nullValue())); + assertThat(props.get("ssl.engine.factory.class"), is(nullValue())); + assertThat(props.get("sasl.client.callback.handler.class"), + is("software.amazon.msk.auth.iam.IAMClientCallbackHandler")); + } + + @Test + public void testSetAuthPropertiesMskWithSaslPlain() throws IOException { + final String testMSKEndpoint = UUID.randomUUID().toString(); + final Properties props = new Properties(); + final KafkaSourceConfig kafkaSourceConfig = createKafkaSinkConfig("kafka-pipeline-msk-sasl-plain.yaml"); + final KafkaClientBuilder kafkaClientBuilder = mock(KafkaClientBuilder.class); + final KafkaClient kafkaClient = mock(KafkaClient.class); + when(kafkaClientBuilder.credentialsProvider(any())).thenReturn(kafkaClientBuilder); + when(kafkaClientBuilder.region(any(Region.class))).thenReturn(kafkaClientBuilder); + when(kafkaClientBuilder.build()).thenReturn(kafkaClient); + final GetBootstrapBrokersResponse response = mock(GetBootstrapBrokersResponse.class); + when(response.bootstrapBrokerStringSaslIam()).thenReturn(testMSKEndpoint); + when(kafkaClient.getBootstrapBrokers(any(GetBootstrapBrokersRequest.class))).thenReturn(response); + try (MockedStatic mockedKafkaClient = mockStatic(KafkaClient.class)) { + mockedKafkaClient.when(KafkaClient::builder).thenReturn(kafkaClientBuilder); + KafkaSecurityConfigurer.setAuthProperties(props, kafkaSourceConfig, LOG); + } + assertThat(props.getProperty("bootstrap.servers"), is(testMSKEndpoint)); + 
assertThat(props.getProperty("sasl.mechanism"), is("PLAIN")); + assertThat(props.getProperty("sasl.jaas.config"), + is("org.apache.kafka.common.security.plain.PlainLoginModule required " + + "username=\"test_sasl_username\" password=\"test_sasl_password\";")); + assertThat(props.getProperty("security.protocol"), is("SASL_SSL")); + assertThat(props.getProperty("certificateContent"), is(nullValue())); + assertThat(props.getProperty("ssl.truststore.location"), is(nullValue())); + assertThat(props.getProperty("ssl.truststore.password"), is(nullValue())); + assertThat(props.get("ssl.engine.factory.class"), is(nullValue())); + } + + @ParameterizedTest + @ValueSource(strings = { + "kafka-pipeline-bootstrap-servers-glue-sts-assume-role.yaml", + "kafka-pipeline-msk-default-glue-sts-assume-role.yaml" + }) + void testGetGlueSerializerWithStsAssumeRoleCredentialsProvider(final String filename) throws IOException { + final KafkaSourceConfig kafkaSourceConfig = createKafkaSinkConfig(filename); + final GlueSchemaRegistryKafkaDeserializer glueSchemaRegistryKafkaDeserializer = KafkaSecurityConfigurer + .getGlueSerializer(kafkaSourceConfig); + assertThat(glueSchemaRegistryKafkaDeserializer, notNullValue()); + assertThat(glueSchemaRegistryKafkaDeserializer.getCredentialProvider(), + instanceOf(StsAssumeRoleCredentialsProvider.class)); + } + + @Test + void testGetGlueSerializerWithDefaultCredentialsProvider() throws IOException { + final KafkaSourceConfig kafkaSourceConfig = createKafkaSinkConfig( + "kafka-pipeline-bootstrap-servers-glue-default.yaml"); + final DefaultAwsRegionProviderChain.Builder defaultAwsRegionProviderChainBuilder = mock( + DefaultAwsRegionProviderChain.Builder.class); + final DefaultAwsRegionProviderChain defaultAwsRegionProviderChain = mock(DefaultAwsRegionProviderChain.class); + when(defaultAwsRegionProviderChainBuilder.build()).thenReturn(defaultAwsRegionProviderChain); + when(defaultAwsRegionProviderChain.getRegion()).thenReturn(Region.US_EAST_1); + try (MockedStatic defaultAwsRegionProviderChainMockedStatic = + mockStatic(DefaultAwsRegionProviderChain.class)) { + defaultAwsRegionProviderChainMockedStatic.when(DefaultAwsRegionProviderChain::builder) + .thenReturn(defaultAwsRegionProviderChainBuilder); + final GlueSchemaRegistryKafkaDeserializer glueSchemaRegistryKafkaDeserializer = KafkaSecurityConfigurer + .getGlueSerializer(kafkaSourceConfig); + assertThat(glueSchemaRegistryKafkaDeserializer, notNullValue()); + assertThat(glueSchemaRegistryKafkaDeserializer.getCredentialProvider(), + instanceOf(DefaultCredentialsProvider.class)); + } + } + @Test void testSetDynamicSaslClientCallbackHandlerWithNonNullPlainTextAuthConfig() { when(kafkaConnectionConfig.getAuthConfig()).thenReturn(authConfig); diff --git a/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-glue-default.yaml b/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-glue-default.yaml new file mode 100644 index 0000000000..5017333415 --- /dev/null +++ b/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-glue-default.yaml @@ -0,0 +1,14 @@ +log-pipeline : + source: + kafka: + bootstrap_servers: + - "localhost:9092" + encryption: + type: "SSL" + schema: + type: aws_glue + topics: + - name: "quickstart-events" + group_id: "groupdID1" + sink: + stdout: \ No newline at end of file diff --git a/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-glue-sts-assume-role.yaml 
b/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-glue-sts-assume-role.yaml new file mode 100644 index 0000000000..4fc036a9de --- /dev/null +++ b/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-glue-sts-assume-role.yaml @@ -0,0 +1,17 @@ +log-pipeline : + source: + kafka: + bootstrap_servers: + - "localhost:9092" + encryption: + type: "SSL" + aws: + region: us-east-2 + sts_role_arn: sts_role_arn + schema: + type: aws_glue + topics: + - name: "quickstart-events" + group_id: "groupdID1" + sink: + stdout: \ No newline at end of file diff --git a/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-override-by-msk.yaml b/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-override-by-msk.yaml new file mode 100644 index 0000000000..889fd0c044 --- /dev/null +++ b/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-override-by-msk.yaml @@ -0,0 +1,20 @@ +log-pipeline : + source: + kafka: + bootstrap_servers: + - "localhost:9092" + encryption: + type: "SSL" + authentication: + sasl: + aws_msk_iam: role + aws: + region: us-east-2 + sts_role_arn: sts_role_arn + msk: + arn: service Arn + topics: + - name: "quickstart-events" + group_id: "groupdID1" + sink: + stdout: \ No newline at end of file diff --git a/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-sasl-iam-default.yaml b/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-sasl-iam-default.yaml new file mode 100644 index 0000000000..0edc808ce3 --- /dev/null +++ b/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-sasl-iam-default.yaml @@ -0,0 +1,15 @@ +log-pipeline : + source: + kafka: + bootstrap_servers: + - "localhost:9092" + encryption: + type: "SSL" + authentication: + sasl: + aws_msk_iam: default + topics: + - name: "quickstart-events" + group_id: "groupdID1" + sink: + stdout: \ No newline at end of file diff --git a/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-sasl-iam-role.yaml b/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-sasl-iam-role.yaml new file mode 100644 index 0000000000..a4ef7fd94b --- /dev/null +++ b/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-bootstrap-servers-sasl-iam-role.yaml @@ -0,0 +1,18 @@ +log-pipeline : + source: + kafka: + bootstrap_servers: + - "localhost:9092" + encryption: + type: "SSL" + authentication: + sasl: + aws_msk_iam: role + aws: + region: us-east-2 + sts_role_arn: test_sasl_iam_sts_role + topics: + - name: "quickstart-events" + group_id: "groupdID1" + sink: + stdout: \ No newline at end of file diff --git a/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-msk-default-glue-sts-assume-role.yaml b/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-msk-default-glue-sts-assume-role.yaml new file mode 100644 index 0000000000..bf94287f26 --- /dev/null +++ b/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-msk-default-glue-sts-assume-role.yaml @@ -0,0 +1,20 @@ +log-pipeline : + source: + kafka: + encryption: + type: "SSL" + authentication: + sasl: + aws_msk_iam: default + aws: + region: us-east-2 + sts_role_arn: sts_role_arn + msk: + arn: service Arn + schema: + type: aws_glue + topics: + - name: "quickstart-events" + group_id: "groupdID1" + sink: + stdout: \ No 
newline at end of file diff --git a/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-msk-sasl-plain.yaml b/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-msk-sasl-plain.yaml new file mode 100644 index 0000000000..f1a44ff414 --- /dev/null +++ b/data-prepper-plugins/kafka-plugins/src/test/resources/kafka-pipeline-msk-sasl-plain.yaml @@ -0,0 +1,20 @@ +log-pipeline : + source: + kafka: + encryption: + type: "SSL" + authentication: + sasl: + plain: + username: test_sasl_username + password: test_sasl_password + aws: + region: us-east-2 + sts_role_arn: sts_role_arn + msk: + arn: service Arn + topics: + - name: "quickstart-events" + group_id: "groupdID1" + sink: + stdout: \ No newline at end of file diff --git a/data-prepper-plugins/key-value-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/keyvalue/KeyValueProcessor.java b/data-prepper-plugins/key-value-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/keyvalue/KeyValueProcessor.java index ea3a7accdb..c42e015829 100644 --- a/data-prepper-plugins/key-value-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/keyvalue/KeyValueProcessor.java +++ b/data-prepper-plugins/key-value-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/keyvalue/KeyValueProcessor.java @@ -281,19 +281,22 @@ private void addPart(List parts, final String str, final int start, fina } } - public int findInStartGroup(final String str, int idx) { + private int findInStartGroup(final String str, final int idx) { + if (idx < 0 || idx >= str.length()) { + return -1; // Invalid starting index + } + for (int j = 0; j < startGroupStrings.length; j++) { - try { - if (startGroupStrings[j].equals(str.substring(idx, idx+startGroupStrings[j].length()))) { - // For " and ', make sure, it's not escaped - if (j <= 1 && (idx == 0 || str.charAt(idx-1) != '\\')) { - return j; - } else if (j > 1) { - return j; - } + String startGroup = startGroupStrings[j]; + int startGroupLen = startGroup.length(); + + if (idx + startGroupLen <= str.length() && str.startsWith(startGroup, idx)) { + // For the first two elements, check for escape characters + if (j <= 1 && (idx == 0 || str.charAt(idx - 1) != '\\')) { + return j; + } else if (j > 1) { + return j; } - } catch (Exception e) { - return -1; } } return -1; diff --git a/data-prepper-plugins/key-value-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/keyvalue/KeyValueProcessorConfig.java b/data-prepper-plugins/key-value-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/keyvalue/KeyValueProcessorConfig.java index 84cdb868e9..bcc8eb0a27 100644 --- a/data-prepper-plugins/key-value-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/keyvalue/KeyValueProcessorConfig.java +++ b/data-prepper-plugins/key-value-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/keyvalue/KeyValueProcessorConfig.java @@ -6,6 +6,7 @@ package org.opensearch.dataprepper.plugins.processor.keyvalue; import com.fasterxml.jackson.annotation.JsonProperty; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; import jakarta.validation.constraints.NotEmpty; import jakarta.validation.constraints.NotNull; import jakarta.validation.constraints.AssertTrue; @@ -35,87 +36,163 @@ public class KeyValueProcessorConfig { static final boolean DEFAULT_RECURSIVE = false; @NotEmpty + @JsonPropertyDescription("The message field to be parsed. Optional. 
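// A hedged sketch of the bounds-safe matching idiom adopted in the findInStartGroup
// rewrite above: String.startsWith with an offset replaces substring() + try/catch, and a
// preceding backslash suppresses a match for the two quote delimiters. The group strings
// and their ordering are assumed here, not taken from the PR.
class StartGroupSketch {
    private static final String[] START_GROUPS = {"\"", "'", "[", "(", "<"};

    static int findInStartGroup(final String str, final int idx) {
        if (idx < 0 || idx >= str.length()) {
            return -1; // invalid starting index
        }
        for (int j = 0; j < START_GROUPS.length; j++) {
            if (str.startsWith(START_GROUPS[j], idx)) {
                final boolean escaped = idx > 0 && str.charAt(idx - 1) == '\\';
                if (j > 1 || !escaped) { // only the quote groups (j <= 1) honor escaping
                    return j;
                }
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        System.out.println(findInStartGroup("a=\\\"x", 3)); // -1: quote at 3 is escaped
        System.out.println(findInStartGroup("a=\"x", 2));   // 0: unescaped double quote
    }
}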
Default value is `message`.") private String source = DEFAULT_SOURCE; + @JsonPropertyDescription("The destination field for the parsed source. The parsed source overwrites the " + + "preexisting data for that key. Optional. If `destination` is set to `null`, the parsed fields will be " + + "written to the root of the event. Default value is `parsed_message`.") private String destination = DEFAULT_DESTINATION; @JsonProperty("field_delimiter_regex") + @JsonPropertyDescription("A regular expression specifying the delimiter that separates key-value pairs. " + + "Special regular expression characters such as `[` and `]` must be escaped with `\\\\`. " + + "Cannot be defined at the same time as `field_split_characters`. Optional. " + + "If this option is not defined, `field_split_characters` is used.") private String fieldDelimiterRegex; @JsonProperty("field_split_characters") + @JsonPropertyDescription("A string of characters specifying the delimiter that separates key-value pairs. " + + "Special regular expression characters such as `[` and `]` must be escaped with `\\\\`. " + + "Cannot be defined at the same time as `field_delimiter_regex`. Optional. Default value is `&`.") private String fieldSplitCharacters = DEFAULT_FIELD_SPLIT_CHARACTERS; @JsonProperty("include_keys") + @JsonPropertyDescription("An array specifying the keys that should be added for parsing. " + + "By default, all keys will be added.") @NotNull private List<String> includeKeys = DEFAULT_INCLUDE_KEYS; @JsonProperty("exclude_keys") + @JsonPropertyDescription("An array specifying the parsed keys that should not be added to the event. " + + "By default, no keys will be excluded.") @NotNull private List<String> excludeKeys = DEFAULT_EXCLUDE_KEYS; @JsonProperty("default_values") + @JsonPropertyDescription("A map specifying the default keys and their values that should be added " + + "to the event in case these keys do not exist in the source field being parsed. " + + "If the default key already exists in the message, the value is not changed. " + + "The `include_keys` filter will be applied to the message before `default_values`.") @NotNull private Map<String, Object> defaultValues = DEFAULT_DEFAULT_VALUES; @JsonProperty("key_value_delimiter_regex") + @JsonPropertyDescription("A regular expression specifying the delimiter that separates the key and value " + + "within a key-value pair. Special regular expression characters such as `[` and `]` must be escaped with " + + "`\\\\`. This option cannot be defined at the same time as `value_split_characters`. Optional. " + + "If this option is not defined, `value_split_characters` is used.") private String keyValueDelimiterRegex; @JsonProperty("value_split_characters") + @JsonPropertyDescription("A string of characters specifying the delimiter that separates the key and value within " + + "a key-value pair. Special regular expression characters such as `[` and `]` must be escaped with `\\\\`. " + + "Cannot be defined at the same time as `key_value_delimiter_regex`. Optional. Default value is `=`.") private String valueSplitCharacters = DEFAULT_VALUE_SPLIT_CHARACTERS; @JsonProperty("non_match_value") + @JsonPropertyDescription("When a key-value pair cannot be successfully split, the key-value pair is " + + "placed in the `key` field, and the specified value is placed in the `value` field. " + + "Optional. Default value is `null`.") private Object nonMatchValue = DEFAULT_NON_MATCH_VALUE; + @JsonPropertyDescription("A prefix to append before all keys. Optional. 
Default value is an empty string.") @NotNull private String prefix = DEFAULT_PREFIX; @JsonProperty("delete_key_regex") + @JsonPropertyDescription("A regular expression specifying the characters to delete from the key. " + + "Special regular expression characters such as `[` and `]` must be escaped with `\\\\`. Cannot be an " + + "empty string. Optional. No default value.") @NotNull private String deleteKeyRegex = DEFAULT_DELETE_KEY_REGEX; @JsonProperty("delete_value_regex") + @JsonPropertyDescription("A regular expression specifying the characters to delete from the value. " + + "Special regular expression characters such as `[` and `]` must be escaped with `\\\\`. " + + "Cannot be an empty string. Optional. No default value.") @NotNull private String deleteValueRegex = DEFAULT_DELETE_VALUE_REGEX; @JsonProperty("transform_key") + @JsonPropertyDescription("When to lowercase, uppercase, or capitalize keys.") @NotNull private String transformKey = DEFAULT_TRANSFORM_KEY; @JsonProperty("whitespace") + @JsonPropertyDescription("Specifies whether to be lenient or strict with the acceptance of " + + "unnecessary white space surrounding the configured value-split sequence. Default is `lenient`.") @NotNull private String whitespace = DEFAULT_WHITESPACE; @JsonProperty("skip_duplicate_values") + @JsonPropertyDescription("A Boolean option for removing duplicate key-value pairs. When set to `true`, " + + "only one unique key-value pair will be preserved. Default is `false`.") @NotNull private boolean skipDuplicateValues = DEFAULT_SKIP_DUPLICATE_VALUES; @JsonProperty("remove_brackets") + @JsonPropertyDescription("Specifies whether to treat square brackets, angle brackets, and parentheses " + + "as value “wrappers” that should be removed from the value. Default is `false`.") @NotNull private boolean removeBrackets = DEFAULT_REMOVE_BRACKETS; @JsonProperty("value_grouping") + @JsonPropertyDescription("Specifies whether to group values using predefined value grouping delimiters: " + + "`{...}`, `[...]`, `<...>`, `(...)`, `\"...\"`, `'...'`, `http://... (space)`, and `https:// (space)`. " + + "If this flag is enabled, then the content between the delimiters is considered to be one entity and " + + "is not parsed for key-value pairs. Default is `false`. If `value_grouping` is `true`, then " + + "`{\"key1=[a=b,c=d]&key2=value2\"}` parses to `{\"key1\": \"[a=b,c=d]\", \"key2\": \"value2\"}`.") private boolean valueGrouping = DEFAULT_VALUE_GROUPING; @JsonProperty("recursive") + @JsonPropertyDescription("Specifies whether to recursively obtain additional key-value pairs from values. " + + "The extra key-value pairs will be stored as sub-keys of the root key. Default is `false`. " + + "The levels of recursive parsing must be defined by different brackets for each level: " + + "`[]`, `()`, and `<>`, in this order. 
Any other configurations specified will only be applied " + + "to the outermost keys.\n" + + "When `recursive` is `true`:\n" + + "`remove_brackets` cannot also be `true`;\n" + + "`skip_duplicate_values` will always be `true`;\n" + + "`whitespace` will always be `\"strict\"`.") @NotNull private boolean recursive = DEFAULT_RECURSIVE; @JsonProperty("tags_on_failure") + @JsonPropertyDescription("When a `kv` operation causes a runtime exception within the processor, " + + "the operation is safely stopped without crashing the processor, and the event is tagged " + + "with the provided tags.") private List<String> tagsOnFailure; @JsonProperty("overwrite_if_destination_exists") + @JsonPropertyDescription("Specifies whether to overwrite existing fields if there are key conflicts " + + "when writing parsed fields to the event. Default is `true`.") private boolean overwriteIfDestinationExists = true; @JsonProperty("drop_keys_with_no_value") + @JsonPropertyDescription("Specifies whether keys should be dropped if they have a null value. Default is `false`. " + + "If `drop_keys_with_no_value` is set to `true`, " + + "then `{\"key1=value1&key2\"}` parses to `{\"key1\": \"value1\"}`.") private boolean dropKeysWithNoValue = false; @JsonProperty("key_value_when") + @JsonPropertyDescription("Allows you to specify a [conditional expression](https://opensearch.org/docs/latest/data-prepper/pipelines/expression-syntax/), " + + "such as `/some-key == \"test\"`, that will be evaluated to determine whether " + + "the processor should be applied to the event.") private String keyValueWhen; @JsonProperty("strict_grouping") + @JsonPropertyDescription("When enabled, groups with unmatched end characters yield errors. " + + "The event is ignored after the errors are logged. " + + "Specifies whether strict grouping should be enabled when the `value_grouping` " + + "or `string_literal_character` options are used. Default is `false`.") private boolean strictGrouping = false; @JsonProperty("string_literal_character") + @JsonPropertyDescription("When this option is used, any text contained within the specified quotation " + + "mark character will be ignored and excluded from key-value parsing. " + + "Can be set to either a single quotation mark (`'`) or a double quotation mark (`\"`). " + + "Default is `null`.") @Size(min = 0, max = 1, message = "string_literal_character may only have character") private String stringLiteralCharacter = null; @@ -124,7 +201,8 @@ boolean isValidValueGroupingAndFieldDelimiterRegex() { return (!valueGrouping || fieldDelimiterRegex == null); } - @AssertTrue(message = "Invalid Configuration. 
String literal character config is valid only when value_grouping is enabled, " + + "and only double quote (\") and single quote are (') are valid string literal characters.") boolean isValidStringLiteralConfig() { if (stringLiteralCharacter == null) return true; diff --git a/data-prepper-plugins/lambda-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/lambda-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index 23c33feb6d..0000000000 --- a/data-prepper-plugins/lambda-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1,3 +0,0 @@ -# To enable mocking of final classes with vanilla Mockito -# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods -mock-maker-inline diff --git a/data-prepper-plugins/lambda-sink/README.md b/data-prepper-plugins/lambda/README.md similarity index 100% rename from data-prepper-plugins/lambda-sink/README.md rename to data-prepper-plugins/lambda/README.md diff --git a/data-prepper-plugins/lambda-sink/build.gradle b/data-prepper-plugins/lambda/build.gradle similarity index 87% rename from data-prepper-plugins/lambda-sink/build.gradle rename to data-prepper-plugins/lambda/build.gradle index 429e190a6a..8447c3abdf 100644 --- a/data-prepper-plugins/lambda-sink/build.gradle +++ b/data-prepper-plugins/lambda/build.gradle @@ -19,9 +19,15 @@ dependencies { implementation'org.json:json' implementation libs.commons.lang3 implementation 'com.fasterxml.jackson.datatype:jackson-datatype-jsr310' + implementation 'org.projectlombok:lombok:1.18.22' + compileOnly 'org.projectlombok:lombok:1.18.20' + annotationProcessor 'org.projectlombok:lombok:1.18.20' + testCompileOnly 'org.projectlombok:lombok:1.18.20' + testAnnotationProcessor 'org.projectlombok:lombok:1.18.20' testImplementation 'com.fasterxml.jackson.datatype:jackson-datatype-jsr310' testImplementation project(':data-prepper-test-common') testImplementation project(':data-prepper-plugins:parse-json-processor') + testImplementation testLibs.slf4j.simple } test { diff --git a/data-prepper-plugins/lambda-sink/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkServiceIT.java b/data-prepper-plugins/lambda/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkServiceIT.java similarity index 93% rename from data-prepper-plugins/lambda-sink/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkServiceIT.java rename to data-prepper-plugins/lambda/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkServiceIT.java index 89cf85ceac..76fb4831ce 100644 --- a/data-prepper-plugins/lambda-sink/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkServiceIT.java +++ b/data-prepper-plugins/lambda/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkServiceIT.java @@ -17,6 +17,7 @@ import org.mockito.Mock; import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.when; import org.mockito.MockitoAnnotations; import org.mockito.junit.jupiter.MockitoExtension; import org.opensearch.dataprepper.aws.api.AwsCredentialsSupplier; @@ -29,12 +30,14 @@ import org.opensearch.dataprepper.model.record.Record; import org.opensearch.dataprepper.model.sink.OutputCodecContext; import 
org.opensearch.dataprepper.model.types.ByteCount; -import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.BufferFactory; -import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.InMemoryBufferFactory; -import org.opensearch.dataprepper.plugins.sink.lambda.config.BatchOptions; -import org.opensearch.dataprepper.plugins.sink.lambda.config.ThresholdOptions; -import org.opensearch.dataprepper.plugins.sink.lambda.config.AwsAuthenticationOptions; -import org.opensearch.dataprepper.plugins.sink.lambda.dlq.DlqPushHandler; +import org.opensearch.dataprepper.plugins.lambda.common.accumlator.BufferFactory; +import org.opensearch.dataprepper.plugins.lambda.common.accumlator.InMemoryBufferFactory; +import org.opensearch.dataprepper.plugins.lambda.common.config.AwsAuthenticationOptions; +import org.opensearch.dataprepper.plugins.lambda.common.config.BatchOptions; +import org.opensearch.dataprepper.plugins.lambda.common.config.ThresholdOptions; +import org.opensearch.dataprepper.plugins.lambda.sink.LambdaSinkConfig; +import org.opensearch.dataprepper.plugins.lambda.sink.LambdaSinkService; +import org.opensearch.dataprepper.plugins.lambda.sink.dlq.DlqPushHandler; import software.amazon.awssdk.regions.Region; import software.amazon.awssdk.services.lambda.LambdaClient; @@ -45,8 +48,6 @@ import java.util.HashMap; import java.util.List; -import static org.mockito.Mockito.when; - @ExtendWith(MockitoExtension.class) class LambdaSinkServiceIT { diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/accumlator/Buffer.java b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/accumlator/Buffer.java similarity index 68% rename from data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/accumlator/Buffer.java rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/accumlator/Buffer.java index 48afbe6a01..f52a8e5de0 100644 --- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/accumlator/Buffer.java +++ b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/accumlator/Buffer.java @@ -3,9 +3,10 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.dataprepper.plugins.sink.lambda.accumlator; +package org.opensearch.dataprepper.plugins.lambda.common.accumlator; import software.amazon.awssdk.core.SdkBytes; +import software.amazon.awssdk.services.lambda.model.InvokeResponse; import java.io.OutputStream; import java.time.Duration; @@ -21,7 +22,9 @@ public interface Buffer { Duration getDuration(); - void flushToLambda(); + void flushToLambdaAsync(); + + InvokeResponse flushToLambdaSync(); OutputStream getOutputStream(); diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/accumlator/BufferFactory.java b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/accumlator/BufferFactory.java similarity index 82% rename from data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/accumlator/BufferFactory.java rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/accumlator/BufferFactory.java index 80afd2f1ca..e44bbd6aee 100644 --- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/accumlator/BufferFactory.java +++ 
b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/accumlator/BufferFactory.java @@ -3,7 +3,7 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.dataprepper.plugins.sink.lambda.accumlator; +package org.opensearch.dataprepper.plugins.lambda.common.accumlator; import software.amazon.awssdk.services.lambda.LambdaClient; diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/accumlator/InMemoryBuffer.java b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/accumlator/InMemoryBuffer.java similarity index 84% rename from data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/accumlator/InMemoryBuffer.java rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/accumlator/InMemoryBuffer.java index bba70c6e62..5d9d5a5134 100644 --- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/accumlator/InMemoryBuffer.java +++ b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/accumlator/InMemoryBuffer.java @@ -3,7 +3,7 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.dataprepper.plugins.sink.lambda.accumlator; +package org.opensearch.dataprepper.plugins.lambda.common.accumlator; import com.fasterxml.jackson.databind.JsonNode; import com.fasterxml.jackson.databind.ObjectMapper; @@ -62,7 +62,22 @@ public Duration getDuration() { @Override - public void flushToLambda() { + public void flushToLambdaAsync() { + InvokeResponse resp; + SdkBytes payload = getPayload(); + + // Setup an InvokeRequest. + InvokeRequest request = InvokeRequest.builder() + .functionName(functionName) + .payload(payload) + .invocationType(invocationType) + .build(); + + lambdaClient.invoke(request); + } + + @Override + public InvokeResponse flushToLambdaSync() { InvokeResponse resp; SdkBytes payload = getPayload(); @@ -74,6 +89,7 @@ public void flushToLambda() { .build(); resp = lambdaClient.invoke(request); + return resp; } private SdkBytes validatePayload(String payload_string) { diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/accumlator/InMemoryBufferFactory.java b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/accumlator/InMemoryBufferFactory.java similarity index 85% rename from data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/accumlator/InMemoryBufferFactory.java rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/accumlator/InMemoryBufferFactory.java index e58952c5cb..37ad4a4105 100644 --- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/accumlator/InMemoryBufferFactory.java +++ b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/accumlator/InMemoryBufferFactory.java @@ -3,7 +3,7 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.dataprepper.plugins.sink.lambda.accumlator; +package org.opensearch.dataprepper.plugins.lambda.common.accumlator; import software.amazon.awssdk.services.lambda.LambdaClient; diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/codec/LambdaJsonCodec.java 
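// A hedged sketch of the invocation split introduced in the InMemoryBuffer diff above:
// the fire-and-forget path ("Event" invocation) keeps the sink's old behavior, while the
// new synchronous path ("RequestResponse") returns the InvokeResponse so callers can
// inspect the result. The helper wiring is illustrative; only the SDK calls match the diff.
import software.amazon.awssdk.core.SdkBytes;
import software.amazon.awssdk.services.lambda.LambdaClient;
import software.amazon.awssdk.services.lambda.model.InvokeRequest;
import software.amazon.awssdk.services.lambda.model.InvokeResponse;

class LambdaInvokeSketch {
    static InvokeResponse invoke(final LambdaClient lambdaClient,
                                 final String functionName,
                                 final String invocationType, // "Event" or "RequestResponse"
                                 final SdkBytes payload) {
        final InvokeRequest request = InvokeRequest.builder()
                .functionName(functionName)
                .payload(payload)
                .invocationType(invocationType)
                .build();
        // For "Event", the response carries only an acceptance status (202); for
        // "RequestResponse", it also carries the function's result payload.
        return lambdaClient.invoke(request);
    }
}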
b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/codec/LambdaJsonCodec.java similarity index 95% rename from data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/codec/LambdaJsonCodec.java rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/codec/LambdaJsonCodec.java index 5bf21f5e18..a1ccaa8561 100644 --- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/codec/LambdaJsonCodec.java +++ b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/codec/LambdaJsonCodec.java @@ -2,7 +2,7 @@ * Copyright OpenSearch Contributors * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.dataprepper.plugins.sink.lambda.codec; +package org.opensearch.dataprepper.plugins.lambda.common.codec; import com.fasterxml.jackson.core.JsonEncoding; import com.fasterxml.jackson.core.JsonFactory; @@ -37,7 +37,6 @@ public String getExtension() { @Override public void start(final OutputStream outputStream, Event event, final OutputCodecContext codecContext) throws IOException { Objects.requireNonNull(outputStream); - Objects.requireNonNull(codecContext); this.codecContext = codecContext; generator = factory.createGenerator(outputStream, JsonEncoding.UTF8); if(Objects.nonNull(keyName)){ diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/config/AwsAuthenticationOptions.java b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/config/AwsAuthenticationOptions.java similarity index 95% rename from data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/config/AwsAuthenticationOptions.java rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/config/AwsAuthenticationOptions.java index 8d6c64829d..e40fa617ee 100644 --- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/config/AwsAuthenticationOptions.java +++ b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/config/AwsAuthenticationOptions.java @@ -3,7 +3,7 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.dataprepper.plugins.sink.lambda.config; +package org.opensearch.dataprepper.plugins.lambda.common.config; import com.fasterxml.jackson.annotation.JsonProperty; import jakarta.validation.constraints.Size; diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/config/BatchOptions.java b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/config/BatchOptions.java similarity index 80% rename from data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/config/BatchOptions.java rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/config/BatchOptions.java index 3773d4e6ed..099bed2b54 100644 --- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/config/BatchOptions.java +++ b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/config/BatchOptions.java @@ -3,7 +3,7 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.dataprepper.plugins.sink.lambda.config; +package 
org.opensearch.dataprepper.plugins.lambda.common.config; import com.fasterxml.jackson.annotation.JsonProperty; import jakarta.validation.constraints.NotNull; @@ -18,7 +18,7 @@ public class BatchOptions { @JsonProperty("threshold") @NotNull - ThresholdOptions thresholdOptions; + ThresholdOptions thresholdOptions = new ThresholdOptions(); public String getBatchKey(){return batchKey;} diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/config/ThresholdOptions.java b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/config/ThresholdOptions.java similarity index 95% rename from data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/config/ThresholdOptions.java rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/config/ThresholdOptions.java index 031157c4be..1f92b90b48 100644 --- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/config/ThresholdOptions.java +++ b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/config/ThresholdOptions.java @@ -3,15 +3,16 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.dataprepper.plugins.sink.lambda.config; +package org.opensearch.dataprepper.plugins.lambda.common.config; import com.fasterxml.jackson.annotation.JsonProperty; +import jakarta.validation.constraints.NotNull; +import jakarta.validation.constraints.Size; import org.hibernate.validator.constraints.time.DurationMax; import org.hibernate.validator.constraints.time.DurationMin; import org.opensearch.dataprepper.model.types.ByteCount; + import java.time.Duration; -import jakarta.validation.constraints.NotNull; -import jakarta.validation.constraints.Size; public class ThresholdOptions { diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/ThresholdCheck.java b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/util/ThresholdCheck.java similarity index 84% rename from data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/ThresholdCheck.java rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/util/ThresholdCheck.java index 74aa98e7f9..6bbf8a4ab8 100644 --- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/ThresholdCheck.java +++ b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/common/util/ThresholdCheck.java @@ -3,10 +3,10 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.dataprepper.plugins.sink.lambda; +package org.opensearch.dataprepper.plugins.lambda.common.util; import org.opensearch.dataprepper.model.types.ByteCount; -import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.Buffer; +import org.opensearch.dataprepper.plugins.lambda.common.accumlator.Buffer; import java.time.Duration; @@ -15,9 +15,6 @@ */ public class ThresholdCheck { - private ThresholdCheck() { - } - public static boolean checkThresholdExceed(final Buffer currentBuffer, final int maxEvents, final ByteCount maxBytes, final Duration maxCollectionDuration, final Boolean isBatchEnabled) { if (!isBatchEnabled) return true; diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaClientFactory.java 
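// A hedged restatement of the relocated ThresholdCheck logic for readers scanning the
// rename above: with batching disabled every record flushes immediately; otherwise a
// flush triggers once the buffered event count, byte size, or elapsed time reaches its
// limit. Types are simplified (long bytes instead of ByteCount) to keep the sketch
// self-contained, and the PR's exact comparisons may differ; this shows the decision shape.
import java.time.Duration;

class ThresholdCheckSketch {
    static boolean checkThresholdExceed(final int eventCount, final long bufferedBytes,
                                        final Duration elapsed, final int maxEvents,
                                        final long maxBytes, final Duration maxCollectionDuration,
                                        final boolean isBatchEnabled) {
        if (!isBatchEnabled) {
            return true; // no batching: flush on every write
        }
        return eventCount >= maxEvents
                || bufferedBytes >= maxBytes
                || elapsed.compareTo(maxCollectionDuration) >= 0;
    }
}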
b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaClientFactory.java similarity index 93% rename from data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaClientFactory.java rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaClientFactory.java index 3e33a4e835..03b94340f0 100644 --- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaClientFactory.java +++ b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaClientFactory.java @@ -3,11 +3,11 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.dataprepper.plugins.sink.lambda; +package org.opensearch.dataprepper.plugins.lambda.sink; import org.opensearch.dataprepper.aws.api.AwsCredentialsOptions; import org.opensearch.dataprepper.aws.api.AwsCredentialsSupplier; -import org.opensearch.dataprepper.plugins.sink.lambda.config.AwsAuthenticationOptions; +import org.opensearch.dataprepper.plugins.lambda.common.config.AwsAuthenticationOptions; import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider; import software.amazon.awssdk.core.client.config.ClientOverrideConfiguration; import software.amazon.awssdk.core.retry.RetryPolicy; diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSink.java b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSink.java similarity index 93% rename from data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSink.java rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSink.java index b1ef905233..54e484fd13 100644 --- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSink.java +++ b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSink.java @@ -3,7 +3,7 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.dataprepper.plugins.sink.lambda; +package org.opensearch.dataprepper.plugins.lambda.sink; import org.opensearch.dataprepper.aws.api.AwsCredentialsSupplier; import org.opensearch.dataprepper.model.annotations.DataPrepperPlugin; @@ -17,9 +17,9 @@ import org.opensearch.dataprepper.model.plugin.PluginFactory; import org.opensearch.dataprepper.model.configuration.PluginSetting; import org.opensearch.dataprepper.model.sink.SinkContext; -import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.BufferFactory; -import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.InMemoryBufferFactory; -import org.opensearch.dataprepper.plugins.sink.lambda.dlq.DlqPushHandler; +import org.opensearch.dataprepper.plugins.lambda.common.accumlator.BufferFactory; +import org.opensearch.dataprepper.plugins.lambda.common.accumlator.InMemoryBufferFactory; +import org.opensearch.dataprepper.plugins.lambda.sink.dlq.DlqPushHandler; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import software.amazon.awssdk.services.lambda.LambdaClient; diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkConfig.java b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkConfig.java similarity index 90% rename from 
data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkConfig.java rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkConfig.java index a20fa41181..bb50e2510e 100644 --- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkConfig.java +++ b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkConfig.java @@ -2,7 +2,7 @@ * Copyright OpenSearch Contributors * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.dataprepper.plugins.sink.lambda; +package org.opensearch.dataprepper.plugins.lambda.sink; import com.fasterxml.jackson.annotation.JsonProperty; import jakarta.validation.Valid; @@ -10,11 +10,11 @@ import jakarta.validation.constraints.NotNull; import jakarta.validation.constraints.Size; import org.opensearch.dataprepper.model.configuration.PluginModel; -import org.opensearch.dataprepper.plugins.sink.lambda.config.AwsAuthenticationOptions; -import org.opensearch.dataprepper.plugins.sink.lambda.config.BatchOptions; +import org.opensearch.dataprepper.plugins.lambda.common.config.AwsAuthenticationOptions; +import org.opensearch.dataprepper.plugins.lambda.common.config.BatchOptions; -import java.util.Objects; import java.util.Map; +import java.util.Objects; public class LambdaSinkConfig { diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkService.java b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkService.java similarity index 92% rename from data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkService.java rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkService.java index f10607e7d1..9a788e6816 100644 --- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkService.java +++ b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkService.java @@ -3,29 +3,30 @@ * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.dataprepper.plugins.sink.lambda; +package org.opensearch.dataprepper.plugins.lambda.sink; import io.micrometer.core.instrument.Counter; +import org.opensearch.dataprepper.aws.api.AwsCredentialsSupplier; +import org.opensearch.dataprepper.metrics.PluginMetrics; import org.opensearch.dataprepper.model.codec.OutputCodec; import org.opensearch.dataprepper.model.configuration.PluginSetting; +import org.opensearch.dataprepper.model.event.Event; import org.opensearch.dataprepper.model.event.EventHandle; import org.opensearch.dataprepper.model.plugin.PluginFactory; +import org.opensearch.dataprepper.model.record.Record; import org.opensearch.dataprepper.model.sink.OutputCodecContext; import org.opensearch.dataprepper.model.types.ByteCount; -import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.Buffer; -import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.BufferFactory; -import org.opensearch.dataprepper.plugins.sink.lambda.codec.LambdaJsonCodec; -import org.opensearch.dataprepper.plugins.sink.lambda.config.BatchOptions; -import org.opensearch.dataprepper.aws.api.AwsCredentialsSupplier; -import org.opensearch.dataprepper.metrics.PluginMetrics; -import org.opensearch.dataprepper.model.event.Event; -import 
org.opensearch.dataprepper.model.record.Record; -import org.opensearch.dataprepper.plugins.sink.lambda.dlq.DlqPushHandler; -import org.opensearch.dataprepper.plugins.sink.lambda.dlq.LambdaSinkFailedDlqData; +import org.opensearch.dataprepper.plugins.lambda.common.accumlator.Buffer; +import org.opensearch.dataprepper.plugins.lambda.common.accumlator.BufferFactory; +import org.opensearch.dataprepper.plugins.lambda.common.codec.LambdaJsonCodec; +import org.opensearch.dataprepper.plugins.lambda.common.config.BatchOptions; +import org.opensearch.dataprepper.plugins.lambda.common.util.ThresholdCheck; +import org.opensearch.dataprepper.plugins.lambda.sink.dlq.DlqPushHandler; +import org.opensearch.dataprepper.plugins.lambda.sink.dlq.LambdaSinkFailedDlqData; import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import software.amazon.awssdk.core.SdkBytes; import software.amazon.awssdk.awscore.exception.AwsServiceException; +import software.amazon.awssdk.core.SdkBytes; import software.amazon.awssdk.core.exception.SdkClientException; import software.amazon.awssdk.services.lambda.LambdaClient; @@ -48,7 +49,7 @@ public class LambdaSinkService { private final PluginSetting pluginSetting; private final Lock reentrantLock; private final LambdaSinkConfig lambdaSinkConfig; - private LambdaClient lambdaClient; + private final LambdaClient lambdaClient; private final String functionName; private int maxEvents = 0; private ByteCount maxBytes = null; @@ -65,9 +66,9 @@ public class LambdaSinkService { private final List events; private OutputCodec codec = null; private final BatchOptions batchOptions; - private Boolean isBatchEnabled; + private final Boolean isBatchEnabled; private OutputCodecContext codecContext = null; - private String batchKey; + private final String batchKey; public LambdaSinkService(final LambdaClient lambdaClient, final LambdaSinkConfig lambdaSinkConfig, @@ -213,7 +214,7 @@ protected boolean retryFlushToLambda(Buffer currentBuffer, do { try { - currentBuffer.flushToLambda(); + currentBuffer.flushToLambdaAsync(); isUploadedToLambda = Boolean.TRUE; } catch (AwsServiceException | SdkClientException e) { errorMsgObj.set(e.getMessage()); diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/dlq/DlqPushHandler.java b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/dlq/DlqPushHandler.java similarity index 98% rename from data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/dlq/DlqPushHandler.java rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/dlq/DlqPushHandler.java index 1bdeb0a394..da8c52eb4e 100644 --- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/dlq/DlqPushHandler.java +++ b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/dlq/DlqPushHandler.java @@ -2,7 +2,7 @@ * Copyright OpenSearch Contributors * SPDX-License-Identifier: Apache-2.0 */ -package org.opensearch.dataprepper.plugins.sink.lambda.dlq; +package org.opensearch.dataprepper.plugins.lambda.sink.dlq; import com.fasterxml.jackson.databind.ObjectWriter; import io.micrometer.core.instrument.util.StringUtils; diff --git a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/dlq/LambdaSinkFailedDlqData.java 
b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/dlq/LambdaSinkFailedDlqData.java
similarity index 95%
rename from data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/dlq/LambdaSinkFailedDlqData.java
rename to data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/dlq/LambdaSinkFailedDlqData.java
index 0808010e37..8941966b77 100644
--- a/data-prepper-plugins/lambda-sink/src/main/java/org/opensearch/dataprepper/plugins/sink/lambda/dlq/LambdaSinkFailedDlqData.java
+++ b/data-prepper-plugins/lambda/src/main/java/org/opensearch/dataprepper/plugins/lambda/sink/dlq/LambdaSinkFailedDlqData.java
@@ -2,7 +2,7 @@
  * Copyright OpenSearch Contributors
  * SPDX-License-Identifier: Apache-2.0
  */
-package org.opensearch.dataprepper.plugins.sink.lambda.dlq;
+package org.opensearch.dataprepper.plugins.lambda.sink.dlq;
 
 import com.fasterxml.jackson.core.JsonProcessingException;
 import software.amazon.awssdk.core.SdkBytes;
diff --git a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/ThresholdCheckTest.java b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/ThresholdCheckTest.java
similarity index 95%
rename from data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/ThresholdCheckTest.java
rename to data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/ThresholdCheckTest.java
index b63553911a..d56420d18f 100644
--- a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/ThresholdCheckTest.java
+++ b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/ThresholdCheckTest.java
@@ -3,23 +3,23 @@
  * SPDX-License-Identifier: Apache-2.0
  */
-package org.opensearch.dataprepper.plugins.sink.lambda;
+package org.opensearch.dataprepper.plugins.lambda.common;
 
+import static org.junit.jupiter.api.Assertions.assertFalse;
+import static org.junit.jupiter.api.Assertions.assertTrue;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;
 import org.junit.jupiter.api.extension.ExtendWith;
 import org.mockito.Mock;
+import static org.mockito.Mockito.when;
 import org.mockito.junit.jupiter.MockitoExtension;
 import org.opensearch.dataprepper.model.types.ByteCount;
-import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.Buffer;
+import org.opensearch.dataprepper.plugins.lambda.common.accumlator.Buffer;
+import org.opensearch.dataprepper.plugins.lambda.common.util.ThresholdCheck;
 
 import java.io.IOException;
 import java.time.Duration;
 
-import static org.junit.jupiter.api.Assertions.assertFalse;
-import static org.junit.jupiter.api.Assertions.assertTrue;
-import static org.mockito.Mockito.when;
-
 @ExtendWith(MockitoExtension.class)
 class ThresholdCheckTest {
diff --git a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/accumulator/InMemoryBufferFactoryTest.java b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/accumulator/InMemoryBufferFactoryTest.java
similarity index 78%
rename from data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/accumulator/InMemoryBufferFactoryTest.java
rename to data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/accumulator/InMemoryBufferFactoryTest.java
index d161b28bb0..37276db819 100644
--- a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/accumulator/InMemoryBufferFactoryTest.java
+++ b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/accumulator/InMemoryBufferFactoryTest.java
@@ -3,12 +3,12 @@
  * SPDX-License-Identifier: Apache-2.0
  */
-package org.opensearch.dataprepper.plugins.sink.lambda.accumulator;
+package org.opensearch.dataprepper.plugins.lambda.common.accumulator;
 
 import org.junit.jupiter.api.Assertions;
 import org.junit.jupiter.api.Test;
-import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.Buffer;
-import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.InMemoryBufferFactory;
+import org.opensearch.dataprepper.plugins.lambda.common.accumlator.Buffer;
+import org.opensearch.dataprepper.plugins.lambda.common.accumlator.InMemoryBufferFactory;
 
 import static org.hamcrest.CoreMatchers.instanceOf;
 import static org.hamcrest.MatcherAssert.assertThat;
diff --git a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/accumulator/InMemoryBufferTest.java b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/accumulator/InMemoryBufferTest.java
similarity index 95%
rename from data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/accumulator/InMemoryBufferTest.java
rename to data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/accumulator/InMemoryBufferTest.java
index 478650a300..fb164b1ac1 100644
--- a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/accumulator/InMemoryBufferTest.java
+++ b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/accumulator/InMemoryBufferTest.java
@@ -3,16 +3,26 @@
  * SPDX-License-Identifier: Apache-2.0
  */
-package org.opensearch.dataprepper.plugins.sink.lambda.accumulator;
+package org.opensearch.dataprepper.plugins.lambda.common.accumulator;
 
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.CoreMatchers.notNullValue;
+import static org.hamcrest.MatcherAssert.assertThat;
 import org.hamcrest.Matchers;
+import static org.hamcrest.Matchers.greaterThanOrEqualTo;
+import static org.hamcrest.Matchers.lessThanOrEqualTo;
 import org.junit.jupiter.api.Assertions;
+import static org.junit.jupiter.api.Assertions.assertDoesNotThrow;
+import static org.junit.jupiter.api.Assertions.assertThrows;
 import org.junit.jupiter.api.Disabled;
 import org.junit.jupiter.api.Test;
 import org.junit.jupiter.api.extension.ExtendWith;
+import static org.mockito.ArgumentMatchers.any;
 import org.mockito.Mock;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.when;
 import org.mockito.junit.jupiter.MockitoExtension;
-import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.InMemoryBuffer;
+import org.opensearch.dataprepper.plugins.lambda.common.accumlator.InMemoryBuffer;
 import software.amazon.awssdk.core.SdkBytes;
 import software.amazon.awssdk.core.exception.SdkClientException;
 import software.amazon.awssdk.services.lambda.LambdaClient;
@@ -25,17 +35,6 @@
 import java.time.Instant;
 import java.time.temporal.ChronoUnit;
 
-import static org.hamcrest.CoreMatchers.equalTo;
-import static org.hamcrest.CoreMatchers.notNullValue;
-import static org.hamcrest.MatcherAssert.assertThat;
-import static org.hamcrest.Matchers.greaterThanOrEqualTo;
-import static org.hamcrest.Matchers.lessThanOrEqualTo;
-import static org.junit.jupiter.api.Assertions.assertDoesNotThrow;
-import static org.junit.jupiter.api.Assertions.assertThrows;
-import static org.mockito.ArgumentMatchers.any;
-import static org.mockito.Mockito.mock;
-import static org.mockito.Mockito.when;
-
 @ExtendWith(MockitoExtension.class)
 class InMemoryBufferTest {
 
@@ -119,7 +118,7 @@ void test_with_write_event_into_buffer_and_flush_toLambda() throws IOException {
             inMemoryBuffer.setEventCount(eventCount);
         }
         assertDoesNotThrow(() -> {
-            inMemoryBuffer.flushToLambda();
+            inMemoryBuffer.flushToLambdaAsync();
         });
     }
 
@@ -136,7 +135,7 @@ void test_uploadedToLambda_success() throws IOException {
         OutputStream outputStream = inMemoryBuffer.getOutputStream();
         outputStream.write(generateByteArray());
         assertDoesNotThrow(() -> {
-            inMemoryBuffer.flushToLambda();
+            inMemoryBuffer.flushToLambdaAsync();
         });
     }
 
@@ -153,7 +152,7 @@ void test_uploadedToLambda_fails() {
         inMemoryBuffer = new InMemoryBuffer(lambdaClient, functionName, invocationType);
         Assertions.assertNotNull(inMemoryBuffer);
-        SdkClientException actualException = assertThrows(SdkClientException.class, () -> inMemoryBuffer.flushToLambda());
+        SdkClientException actualException = assertThrows(SdkClientException.class, () -> inMemoryBuffer.flushToLambdaAsync());
         assertThat(actualException, Matchers.equalTo(sdkClientException));
     }
diff --git a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/codec/LambdaJsonCodecTest.java b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/codec/LambdaJsonCodecTest.java
similarity index 98%
rename from data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/codec/LambdaJsonCodecTest.java
rename to data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/codec/LambdaJsonCodecTest.java
index 6de6ce8a0e..4b6e4c5caf 100644
--- a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/codec/LambdaJsonCodecTest.java
+++ b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/codec/LambdaJsonCodecTest.java
@@ -3,7 +3,7 @@
  * SPDX-License-Identifier: Apache-2.0
  */
-package org.opensearch.dataprepper.plugins.sink.lambda.codec;
+package org.opensearch.dataprepper.plugins.lambda.common.codec;
 
 import com.fasterxml.jackson.databind.JsonNode;
 import com.fasterxml.jackson.databind.ObjectMapper;
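The renamed tests above track an interface change on the buffer: flushToLambda() becomes flushToLambdaAsync(). A minimal sketch of the call those tests exercise, assuming the buffer still wraps the synchronous AWS SDK v2 LambdaClient; only the names taken from the diff (flushToLambdaAsync, LambdaClient, SdkBytes, the invocation-type parameter) are from the change, the rest is illustrative:

import software.amazon.awssdk.core.SdkBytes;
import software.amazon.awssdk.services.lambda.LambdaClient;
import software.amazon.awssdk.services.lambda.model.InvokeRequest;
import software.amazon.awssdk.services.lambda.model.InvokeResponse;

class AsyncFlushSketch {
    // Invocation pattern exercised by test_with_write_event_into_buffer_and_flush_toLambda:
    // an invocationType of "Event" makes Lambda queue the payload and return immediately.
    static InvokeResponse flushToLambdaAsync(final LambdaClient lambdaClient,
                                             final String functionName,
                                             final String invocationType,
                                             final byte[] payload) {
        final InvokeRequest request = InvokeRequest.builder()
                .functionName(functionName)
                .invocationType(invocationType)
                .payload(SdkBytes.fromByteArray(payload))
                .build();
        // An SdkClientException from invoke() propagates to the caller,
        // as asserted in test_uploadedToLambda_fails.
        return lambdaClient.invoke(request);
    }
}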
diff --git a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/config/ThresholdOptionsTest.java b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/config/ThresholdOptionsTest.java
similarity index 93%
rename from data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/config/ThresholdOptionsTest.java
rename to data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/config/ThresholdOptionsTest.java
index 53bd0a4edf..5d12aca3da 100644
--- a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/config/ThresholdOptionsTest.java
+++ b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/common/config/ThresholdOptionsTest.java
@@ -3,13 +3,12 @@
  * SPDX-License-Identifier: Apache-2.0
  */
-package org.opensearch.dataprepper.plugins.sink.lambda.config;
-
-import org.junit.jupiter.api.Test;
-import org.opensearch.dataprepper.model.types.ByteCount;
+package org.opensearch.dataprepper.plugins.lambda.common.config;
 
 import static org.hamcrest.CoreMatchers.equalTo;
 import static org.hamcrest.MatcherAssert.assertThat;
+import org.junit.jupiter.api.Test;
+import org.opensearch.dataprepper.model.types.ByteCount;
 
 class ThresholdOptionsTest {
     private static final String DEFAULT_BYTE_CAPACITY = "6mb";
diff --git a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaClientFactoryTest.java b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaClientFactoryTest.java
similarity index 96%
rename from data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaClientFactoryTest.java
rename to data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaClientFactoryTest.java
index ab72ee44b8..9ed5c71fb2 100644
--- a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaClientFactoryTest.java
+++ b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaClientFactoryTest.java
@@ -2,35 +2,34 @@
  * Copyright OpenSearch Contributors
  * SPDX-License-Identifier: Apache-2.0
  */
-package org.opensearch.dataprepper.plugins.sink.lambda;
+package org.opensearch.dataprepper.plugins.lambda.sink;
 
+import static org.hamcrest.CoreMatchers.notNullValue;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.equalTo;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;
 import org.junit.jupiter.api.extension.ExtendWith;
 import org.junit.jupiter.params.ParameterizedTest;
 import org.junit.jupiter.params.provider.ValueSource;
 import org.mockito.ArgumentCaptor;
+import static org.mockito.ArgumentMatchers.any;
 import org.mockito.Mock;
 import org.mockito.MockedStatic;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.mockStatic;
+import static org.mockito.Mockito.verify;
+import static org.mockito.Mockito.when;
 import org.mockito.junit.jupiter.MockitoExtension;
 import org.opensearch.dataprepper.aws.api.AwsCredentialsOptions;
 import org.opensearch.dataprepper.aws.api.AwsCredentialsSupplier;
-import org.opensearch.dataprepper.plugins.sink.lambda.config.AwsAuthenticationOptions;
+import org.opensearch.dataprepper.plugins.lambda.common.config.AwsAuthenticationOptions;
 import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
 import software.amazon.awssdk.core.client.config.ClientOverrideConfiguration;
 import software.amazon.awssdk.regions.Region;
 import software.amazon.awssdk.services.lambda.LambdaClient;
 import software.amazon.awssdk.services.lambda.LambdaClientBuilder;
 
-import static org.hamcrest.Matchers.equalTo;
-import static org.hamcrest.CoreMatchers.notNullValue;
-import static org.hamcrest.MatcherAssert.assertThat;
-import static org.mockito.ArgumentMatchers.any;
-import static org.mockito.Mockito.mock;
-import static org.mockito.Mockito.mockStatic;
-import static org.mockito.Mockito.verify;
-import static org.mockito.Mockito.when;
-
 import java.util.Map;
 import java.util.UUID;
diff --git a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkConfigTest.java b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkConfigTest.java
similarity index 94%
rename from data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkConfigTest.java
rename to data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkConfigTest.java
index eda9488a04..2a6dad3a69 100644
--- a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkConfigTest.java
+++ b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkConfigTest.java
@@ -2,12 +2,13 @@
  * Copyright OpenSearch Contributors
  * SPDX-License-Identifier: Apache-2.0
  */
-package org.opensearch.dataprepper.plugins.sink.lambda;
+package org.opensearch.dataprepper.plugins.lambda.sink;
 
 import com.fasterxml.jackson.core.JsonProcessingException;
 import com.fasterxml.jackson.databind.ObjectMapper;
 import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;
 import com.fasterxml.jackson.dataformat.yaml.YAMLGenerator;
+import org.hamcrest.MatcherAssert;
 import org.junit.jupiter.api.Test;
 import software.amazon.awssdk.regions.Region;
 
@@ -21,7 +22,7 @@ class LambdaSinkConfigTest {
 
     @Test
     void lambda_sink_default_max_connection_retries_test(){
-        assertThat(new LambdaSinkConfig().getMaxConnectionRetries(),equalTo(DEFAULT_MAX_RETRIES));
+        MatcherAssert.assertThat(new LambdaSinkConfig().getMaxConnectionRetries(),equalTo(DEFAULT_MAX_RETRIES));
    }
 
    @Test
diff --git a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkServiceTest.java b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkServiceTest.java
similarity index 94%
rename from data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkServiceTest.java
rename to data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkServiceTest.java
index bbab8778c0..f8ca0f11ec 100644
--- a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkServiceTest.java
+++ b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkServiceTest.java
@@ -2,34 +2,46 @@
  * Copyright OpenSearch Contributors
  * SPDX-License-Identifier: Apache-2.0
  */
-package org.opensearch.dataprepper.plugins.sink.lambda;
+package org.opensearch.dataprepper.plugins.lambda.sink;
 
 import com.fasterxml.jackson.databind.ObjectMapper;
 import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;
 import com.fasterxml.jackson.dataformat.yaml.YAMLGenerator;
 import io.micrometer.core.instrument.Counter;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.equalTo;
+import static org.junit.jupiter.api.Assertions.assertEquals;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;
 import org.mockito.ArgumentCaptor;
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.BDDMockito.given;
+import static org.mockito.Mockito.doNothing;
+import static org.mockito.Mockito.doThrow;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.times;
+import static org.mockito.Mockito.verify;
+import static org.mockito.Mockito.verifyNoInteractions;
+import static org.mockito.Mockito.when;
+import org.opensearch.dataprepper.aws.api.AwsCredentialsSupplier;
 import org.opensearch.dataprepper.metrics.PluginMetrics;
 import org.opensearch.dataprepper.model.configuration.PluginSetting;
 import org.opensearch.dataprepper.model.event.Event;
 import org.opensearch.dataprepper.model.event.EventHandle;
 import org.opensearch.dataprepper.model.event.JacksonEvent;
-import org.opensearch.dataprepper.model.record.Record;
 import org.opensearch.dataprepper.model.plugin.PluginFactory;
+import org.opensearch.dataprepper.model.record.Record;
 import org.opensearch.dataprepper.model.sink.OutputCodecContext;
 import org.opensearch.dataprepper.model.types.ByteCount;
-import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.Buffer;
-import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.BufferFactory;
-import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.InMemoryBuffer;
-import org.opensearch.dataprepper.plugins.sink.lambda.accumlator.InMemoryBufferFactory;
-import org.opensearch.dataprepper.plugins.sink.lambda.config.AwsAuthenticationOptions;
-import org.opensearch.dataprepper.plugins.sink.lambda.config.BatchOptions;
-import org.opensearch.dataprepper.plugins.sink.lambda.config.ThresholdOptions;
-import org.opensearch.dataprepper.plugins.sink.lambda.dlq.DlqPushHandler;
-import org.opensearch.dataprepper.plugins.sink.lambda.dlq.LambdaSinkFailedDlqData;
-import org.opensearch.dataprepper.aws.api.AwsCredentialsSupplier;
+import org.opensearch.dataprepper.plugins.lambda.common.accumlator.Buffer;
+import org.opensearch.dataprepper.plugins.lambda.common.accumlator.BufferFactory;
+import org.opensearch.dataprepper.plugins.lambda.common.accumlator.InMemoryBuffer;
+import org.opensearch.dataprepper.plugins.lambda.common.accumlator.InMemoryBufferFactory;
+import org.opensearch.dataprepper.plugins.lambda.common.config.AwsAuthenticationOptions;
+import org.opensearch.dataprepper.plugins.lambda.common.config.BatchOptions;
+import org.opensearch.dataprepper.plugins.lambda.common.config.ThresholdOptions;
+import org.opensearch.dataprepper.plugins.lambda.sink.dlq.DlqPushHandler;
+import org.opensearch.dataprepper.plugins.lambda.sink.dlq.LambdaSinkFailedDlqData;
 import software.amazon.awssdk.awscore.exception.AwsServiceException;
 import software.amazon.awssdk.core.SdkBytes;
 import software.amazon.awssdk.http.SdkHttpResponse;
@@ -37,7 +49,6 @@
 import software.amazon.awssdk.services.lambda.model.InvokeRequest;
 import software.amazon.awssdk.services.lambda.model.InvokeResponse;
 
-
 import java.io.ByteArrayOutputStream;
 import java.io.IOException;
 import java.time.Duration;
@@ -47,19 +58,6 @@
 import java.util.List;
 import java.util.Map;
 
-import static org.mockito.Mockito.mock;
-import static org.mockito.Mockito.when;
-import static org.mockito.ArgumentMatchers.any;
-import static org.hamcrest.MatcherAssert.assertThat;
-import static org.hamcrest.Matchers.equalTo;
-import static org.junit.jupiter.api.Assertions.assertEquals;
-import static org.mockito.BDDMockito.given;
-import static org.mockito.Mockito.doNothing;
-import static org.mockito.Mockito.doThrow;
-import static org.mockito.Mockito.times;
-import static org.mockito.Mockito.verify;
-import static org.mockito.Mockito.verifyNoInteractions;
-
 public class LambdaSinkServiceTest {
 
     public static final int maxEvents = 10;
@@ -193,7 +191,7 @@ public void lambda_sink_test_max_retires_works() throws IOException {
         ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
         when(buffer.getOutputStream()).thenReturn(byteArrayOutputStream);
         when(bufferFactory.getBuffer(any(LambdaClient.class),any(),any())).thenReturn(buffer);
-        doThrow(AwsServiceException.class).when(buffer).flushToLambda();
+        doThrow(AwsServiceException.class).when(buffer).flushToLambdaAsync();
 
         LambdaSinkService lambdaSinkService = new LambdaSinkService(lambdaClient,
                 lambdaSinkConfig,
@@ -209,7 +207,7 @@ public void lambda_sink_test_max_retires_works() throws IOException {
         Collection> records = List.of(eventRecord);
         lambdaSinkService.output(records);
 
-        verify(buffer, times(3)).flushToLambda();
+        verify(buffer, times(3)).flushToLambdaAsync();
     }
 
     @Test
@@ -232,7 +230,7 @@ public void lambda_sink_test_dlq_works() throws IOException {
         when(buffer.getOutputStream()).thenReturn(byteArrayOutputStream);
         when(bufferFactory.getBuffer(any(LambdaClient.class),any(),any())).thenReturn(buffer);
 
-        doThrow(AwsServiceException.class).when(buffer).flushToLambda();
+        doThrow(AwsServiceException.class).when(buffer).flushToLambdaAsync();
 
         LambdaSinkService lambdaSinkService = new LambdaSinkService(lambdaClient,
                 lambdaSinkConfig,
@@ -249,7 +247,7 @@ public void lambda_sink_test_dlq_works() throws IOException {
 
         lambdaSinkService.output(records);
 
-        verify(buffer, times(3)).flushToLambda();
+        verify(buffer, times(3)).flushToLambdaAsync();
         verify(dlqPushHandler,times(1)).perform(any(PluginSetting.class),any(Object.class));
     }
 
@@ -296,7 +294,7 @@ public void lambda_sink_test_batch_enabled() throws IOException {
         when(lambdaSinkConfig.getBatchOptions()).thenReturn(mock(BatchOptions.class));
         when(lambdaSinkConfig.getBatchOptions().getBatchKey()).thenReturn(batchKey);
         when(lambdaSinkConfig.getBatchOptions().getThresholdOptions()).thenReturn(mock(ThresholdOptions.class));
-        when(lambdaSinkConfig.getBatchOptions().getThresholdOptions().getEventCount()).thenReturn(maxEvents);
+        when(lambdaSinkConfig.getBatchOptions().getThresholdOptions().getEventCount()).thenReturn(1);
         when(lambdaSinkConfig.getBatchOptions().getThresholdOptions().getMaximumSize()).thenReturn(ByteCount.parse(maxSize));
         when(lambdaSinkConfig.getBatchOptions().getThresholdOptions().getEventCollectTimeOut()).thenReturn(Duration.ofNanos(10L));
         when(lambdaSinkConfig.getAwsAuthenticationOptions()).thenReturn(mock(AwsAuthenticationOptions.class));
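The max-retries and DLQ tests pin down the sink's failure contract: exactly three flush attempts, then a single hand-off to the DLQ handler. A sketch of that control flow, assuming a simple counted-retry loop; the constant name and loop shape are illustrative, while Buffer, DlqPushHandler, and the perform(PluginSetting, Object) call come from the diff:

import org.opensearch.dataprepper.model.configuration.PluginSetting;
import org.opensearch.dataprepper.plugins.lambda.common.accumlator.Buffer;
import org.opensearch.dataprepper.plugins.lambda.sink.dlq.DlqPushHandler;
import software.amazon.awssdk.awscore.exception.AwsServiceException;

class RetryThenDlqSketch {
    private static final int MAX_RETRY = 3; // the tests verify exactly three flush attempts

    void flushWithRetries(final Buffer buffer, final DlqPushHandler dlqPushHandler,
                          final PluginSetting pluginSetting, final Object failedDlqData) {
        int attempt = 0;
        while (attempt < MAX_RETRY) {
            try {
                buffer.flushToLambdaAsync();
                return; // success: nothing goes to the DLQ
            } catch (final AwsServiceException e) {
                attempt++; // retried up to MAX_RETRY, matching verify(buffer, times(3))
            }
        }
        // one DLQ hand-off after retries are exhausted, matching verify(dlqPushHandler, times(1))
        dlqPushHandler.perform(pluginSetting, failedDlqData);
    }
}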
diff --git a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkTest.java b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkTest.java
similarity index 95%
rename from data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkTest.java
rename to data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkTest.java
index 1687cbd285..9a042014f0 100644
--- a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/LambdaSinkTest.java
+++ b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/LambdaSinkTest.java
@@ -3,28 +3,27 @@
  * SPDX-License-Identifier: Apache-2.0
  */
-package org.opensearch.dataprepper.plugins.sink.lambda;
+package org.opensearch.dataprepper.plugins.lambda.sink;
 
 import org.junit.jupiter.api.Assertions;
+import static org.junit.jupiter.api.Assertions.assertFalse;
+import static org.junit.jupiter.api.Assertions.assertTrue;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.when;
 import org.opensearch.dataprepper.aws.api.AwsCredentialsSupplier;
 import org.opensearch.dataprepper.model.configuration.PluginModel;
 import org.opensearch.dataprepper.model.configuration.PluginSetting;
 import org.opensearch.dataprepper.model.plugin.PluginFactory;
 import org.opensearch.dataprepper.model.sink.SinkContext;
-import org.opensearch.dataprepper.plugins.sink.lambda.config.AwsAuthenticationOptions;
+import org.opensearch.dataprepper.plugins.lambda.common.config.AwsAuthenticationOptions;
 import software.amazon.awssdk.regions.Region;
 import software.amazon.awssdk.services.lambda.LambdaClient;
 
 import java.util.HashMap;
 import java.util.Map;
 
-import static org.junit.jupiter.api.Assertions.assertFalse;
-import static org.junit.jupiter.api.Assertions.assertTrue;
-import static org.mockito.Mockito.mock;
-import static org.mockito.Mockito.when;
-
 class LambdaSinkTest {
 
     public static final String S3_REGION = "us-east-1";
diff --git a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/dlq/DlqPushHandlerTest.java b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/dlq/DlqPushHandlerTest.java
similarity index 95%
rename from data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/dlq/DlqPushHandlerTest.java
rename to data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/dlq/DlqPushHandlerTest.java
index 17f39973b7..e1de3303a1 100644
--- a/data-prepper-plugins/lambda-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/lambda/dlq/DlqPushHandlerTest.java
+++ b/data-prepper-plugins/lambda/src/test/java/org/opensearch/dataprepper/plugins/lambda/sink/dlq/DlqPushHandlerTest.java
@@ -2,17 +2,24 @@
  * Copyright OpenSearch Contributors
  * SPDX-License-Identifier: Apache-2.0
  */
-package org.opensearch.dataprepper.plugins.sink.lambda.dlq;
+package org.opensearch.dataprepper.plugins.lambda.sink.dlq;
 
 import org.junit.jupiter.api.Assertions;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.ArgumentMatchers.anyList;
+import static org.mockito.ArgumentMatchers.anyString;
+import static org.mockito.Mockito.doNothing;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.verify;
+import static org.mockito.Mockito.when;
 import org.opensearch.dataprepper.model.configuration.PluginModel;
 import org.opensearch.dataprepper.model.configuration.PluginSetting;
 import org.opensearch.dataprepper.model.plugin.PluginFactory;
+import org.opensearch.dataprepper.plugins.lambda.common.config.AwsAuthenticationOptions;
 import org.opensearch.dataprepper.plugins.dlq.DlqProvider;
 import org.opensearch.dataprepper.plugins.dlq.DlqWriter;
-import org.opensearch.dataprepper.plugins.sink.lambda.config.AwsAuthenticationOptions;
 import software.amazon.awssdk.core.SdkBytes;
 
 import java.io.IOException;
@@ -20,14 +27,6 @@
 import java.util.Map;
 import java.util.Optional;
 
-import static org.mockito.ArgumentMatchers.any;
-import static org.mockito.ArgumentMatchers.anyList;
-import static org.mockito.ArgumentMatchers.anyString;
-import static org.mockito.Mockito.doNothing;
-import static org.mockito.Mockito.mock;
-import static org.mockito.Mockito.verify;
-import static org.mockito.Mockito.when;
-
 class DlqPushHandlerTest {
 
     private static final String BUCKET = "bucket";
diff --git a/data-prepper-plugins/lambda/src/test/resources/simplelogger.properties b/data-prepper-plugins/lambda/src/test/resources/simplelogger.properties
new file mode 100644
index 0000000000..f464558cf4
--- /dev/null
+++ b/data-prepper-plugins/lambda/src/test/resources/simplelogger.properties
@@ -0,0 +1,8 @@
+#
+# Copyright OpenSearch Contributors
+# SPDX-License-Identifier: Apache-2.0
+#
+
+org.slf4j.simpleLogger.showDateTime=true
+org.slf4j.simpleLogger.dateTimeFormat=yyyy-MM-dd' 'HH:mm:ss.SSS
+org.slf4j.simpleLogger.log.org.opensearch.dataprepper.plugins.lambda.sink=trace
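The new simplelogger.properties test resource turns on trace-level output for the lambda sink package when tests run with the slf4j-simple binding. Any SLF4J logger whose name falls under org.opensearch.dataprepper.plugins.lambda.sink picks the setting up automatically; the class below is illustrative, not part of the change:

package org.opensearch.dataprepper.plugins.lambda.sink;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

class TraceLoggingExample {
    private static final Logger LOG = LoggerFactory.getLogger(TraceLoggingExample.class);

    void logFlush(final int eventCount) {
        // Printed with a timestamp because simplelogger.properties sets showDateTime=true,
        // and emitted at all because the package logger level is configured to trace.
        LOG.trace("flushing {} events to Lambda", eventCount);
    }
}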
diff --git a/data-prepper-plugins/mongodb/build.gradle b/data-prepper-plugins/mongodb/build.gradle
index ae4a5a9d45..c5495880e6 100644
--- a/data-prepper-plugins/mongodb/build.gradle
+++ b/data-prepper-plugins/mongodb/build.gradle
@@ -16,7 +16,6 @@ dependencies {
 
     implementation project(path: ':data-prepper-plugins:common')
 
-    testImplementation testLibs.mockito.inline
     testImplementation testLibs.bundles.junit
     testImplementation testLibs.slf4j.simple
     testImplementation project(path: ':data-prepper-test-common')
diff --git a/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/documentdb/DocumentDBService.java b/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/documentdb/DocumentDBService.java
index 73567b8605..1acf21620b 100644
--- a/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/documentdb/DocumentDBService.java
+++ b/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/documentdb/DocumentDBService.java
@@ -33,6 +33,7 @@ public class DocumentDBService {
     private final PluginConfigObservable pluginConfigObservable;
     private final DocumentDBSourceAggregateMetrics documentDBAggregateMetrics;
     private ExecutorService leaderExecutor;
+    private MongoTasksRefresher mongoTasksRefresher;
 
     public DocumentDBService(final EnhancedSourceCoordinator sourceCoordinator,
                              final MongoDBSourceConfig sourceConfig,
                              final PluginMetrics pluginMetrics,
@@ -68,7 +69,7 @@ public void start(Buffer> buffer) {
                 BackgroundThreadFactory.defaultExecutorThreadFactory("documentdb-source"));
         runnableList.forEach(leaderExecutor::submit);
 
-        final MongoTasksRefresher mongoTasksRefresher = new MongoTasksRefresher(
+        mongoTasksRefresher = new MongoTasksRefresher(
                 buffer, sourceCoordinator, pluginMetrics, acknowledgementSetManager,
                 numThread -> Executors.newFixedThreadPool(
                         numThread, BackgroundThreadFactory.defaultExecutorThreadFactory("documentdb-source")),
@@ -105,5 +106,10 @@ public void shutdown() {
             LOG.info("shutdown DocumentDB Service scheduler and worker");
             leaderExecutor.shutdownNow();
         }
+
+        if (mongoTasksRefresher != null) {
+            LOG.info("shutdown DocumentDB Task refresher");
+            mongoTasksRefresher.shutdown();
+        }
     }
 }
diff --git a/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/documentdb/MongoTasksRefresher.java b/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/documentdb/MongoTasksRefresher.java
index 19c988f285..3fea4680e8 100644
--- a/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/documentdb/MongoTasksRefresher.java
+++ b/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/documentdb/MongoTasksRefresher.java
@@ -91,7 +91,7 @@ public void update(MongoDBSourceConfig pluginConfig) {
     private void refreshJobs(MongoDBSourceConfig pluginConfig) {
         final List runnables = new ArrayList<>();
         if (pluginConfig.getCollections().stream().anyMatch(CollectionConfig::isExport)) {
-            currentMongoDBExportPartitionSupplier = new MongoDBExportPartitionSupplier(pluginConfig, documentDBAggregateMetrics);
+            currentMongoDBExportPartitionSupplier = new MongoDBExportPartitionSupplier(pluginConfig, sourceCoordinator, documentDBAggregateMetrics);
             runnables.add(new ExportScheduler(sourceCoordinator, currentMongoDBExportPartitionSupplier, pluginMetrics));
             runnables.add(new ExportWorker(
                     sourceCoordinator, buffer, pluginMetrics, acknowledgementSetManager, pluginConfig, s3PathPrefix, documentDBAggregateMetrics));
@@ -110,4 +110,15 @@ private boolean basicAuthChanged(final MongoDBSourceConfig.AuthenticationConfig
         return !Objects.equals(currentAuthConfig.getUsername(), newAuthConfig.getUsername()) ||
                 !Objects.equals(currentAuthConfig.getPassword(), newAuthConfig.getPassword());
     }
+
+    /**
+     * Interrupts the running schedulers.
+     * Each scheduler must implement logic for graceful shutdown.
+     */
+    public void shutdown() {
+        if (currentExecutor != null) {
+            LOG.info("shutting down export worker and stream worker");
+            currentExecutor.shutdownNow();
+        }
+    }
 }
diff --git a/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/export/MongoDBExportPartitionSupplier.java b/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/export/MongoDBExportPartitionSupplier.java
index d3fb3aac3c..dfbf518318 100644
--- a/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/export/MongoDBExportPartitionSupplier.java
+++ b/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/export/MongoDBExportPartitionSupplier.java
@@ -14,6 +14,7 @@
 import com.mongodb.client.model.Filters;
 import org.bson.Document;
 import org.opensearch.dataprepper.model.source.coordinator.PartitionIdentifier;
+import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourceCoordinator;
 import org.opensearch.dataprepper.plugins.mongo.client.BsonHelper;
 import org.opensearch.dataprepper.plugins.mongo.client.MongoDBConnection;
 import org.opensearch.dataprepper.plugins.mongo.configuration.MongoDBSourceConfig;
@@ -38,11 +39,14 @@ public class MongoDBExportPartitionSupplier implements Function= checkPointIntervalInMs) {
                 long ackCount = 0;
                 do {
@@ -69,17 +69,16 @@
                     if (ackCount % CHECKPOINT_RECORD_INTERVAL == 0) {
                         checkpoint(lastCheckpointStatus.getResumeToken(), lastCheckpointStatus.getRecordCount());
                     }
-                } while (checkpointStatus != null && checkpointStatus.isAcknowledged());
+                } while (checkpointStatus != null && checkpointStatus.isPositiveAcknowledgement());
                 checkpoint(lastCheckpointStatus.getResumeToken(), lastCheckpointStatus.getRecordCount());
                 lastCheckpointTime = System.currentTimeMillis();
             }
         } else {
             LOG.debug("Checkpoint not complete for resume token {}", checkpointStatus.getResumeToken());
             final Duration ackWaitDuration = Duration.between(Instant.ofEpochMilli(checkpointStatus.getCreateTimestamp()), Instant.now());
-            // Acknowledgement not received for the checkpoint after twice ack wait time
-            if (ackWaitDuration.getSeconds() >= partitionAcknowledgmentTimeout.getSeconds() * 2) {
+            if (checkpointStatus.isNegativeAcknowledgement()) {
                 // Give up partition and should interrupt parent thread to stop processing stream
-                if (lastCheckpointStatus != null && lastCheckpointStatus.isAcknowledged()) {
+                if (lastCheckpointStatus != null && lastCheckpointStatus.isPositiveAcknowledgement()) {
                     partitionCheckpoint.checkpoint(lastCheckpointStatus.getResumeToken(), lastCheckpointStatus.getRecordCount());
                 }
                 LOG.warn("Acknowledgement not received for the checkpoint {} past wait time. Giving up partition.", checkpointStatus.getResumeToken());
@@ -124,12 +123,13 @@ Optional createAcknowledgementSet(final String resumeToken,
         ackStatus.put(resumeToken, checkpointStatus);
         LOG.debug("Creating acknowledgment for resumeToken {}", checkpointStatus.getResumeToken());
         return Optional.of(acknowledgementSetManager.create((result) -> {
+            final CheckpointStatus ackCheckpointStatus = ackStatus.get(resumeToken);
+            ackCheckpointStatus.setAcknowledgedTimestamp(Instant.now().toEpochMilli());
             if (result) {
-                final CheckpointStatus ackCheckpointStatus = ackStatus.get(resumeToken);
-                ackCheckpointStatus.setAcknowledgedTimestamp(Instant.now().toEpochMilli());
-                ackCheckpointStatus.setAcknowledged(true);
+                ackCheckpointStatus.setAcknowledged(CheckpointStatus.AcknowledgmentStatus.POSITIVE_ACK);
                 LOG.debug("Received acknowledgment of completion from sink for checkpoint {}", resumeToken);
             } else {
+                ackCheckpointStatus.setAcknowledged(CheckpointStatus.AcknowledgmentStatus.NEGATIVE_ACK);
                 LOG.warn("Negative acknowledgment received for checkpoint {}, resetting checkpoint", resumeToken);
                 // default CheckpointStatus acknowledged value is false. The monitorCheckpoints method will time out
                 // and reprocess stream from last successful checkpoint in the order.
diff --git a/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/utils/DocumentDBSourceAggregateMetrics.java b/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/utils/DocumentDBSourceAggregateMetrics.java
index db85260a52..c5a09a45e7 100644
--- a/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/utils/DocumentDBSourceAggregateMetrics.java
+++ b/data-prepper-plugins/mongodb/src/main/java/org/opensearch/dataprepper/plugins/mongo/utils/DocumentDBSourceAggregateMetrics.java
@@ -17,8 +17,7 @@ public class DocumentDBSourceAggregateMetrics {
     private static final String DOCUMENT_DB_EXPORT_5XX_ERRORS = "export5xxErrors";
     private static final String DOCUMENT_DB_EXPORT_4XX_ERRORS = "export4xxErrors";
     private static final String DOCUMENT_DB_EXPORT_API_INVOCATIONS = "exportApiInvocations";
-
-
+    private static final String DOCUMENT_DB_EXPORT_PARTITION_QUERY_COUNT = "exportPartitionQueryCount";
 
     private final PluginMetrics pluginMetrics;
 
@@ -28,6 +27,7 @@ public class DocumentDBSourceAggregateMetrics {
     private final Counter export5xxErrors;
     private final Counter export4xxErrors;
     private final Counter exportApiInvocations;
+    private final Counter exportPartitionQueryCount;
 
     public DocumentDBSourceAggregateMetrics() {
         this.pluginMetrics = PluginMetrics.fromPrefix(DOCUMENT_DB);
@@ -37,6 +37,7 @@ public DocumentDBSourceAggregateMetrics() {
         this.export5xxErrors = pluginMetrics.counter(DOCUMENT_DB_EXPORT_5XX_ERRORS);
         this.export4xxErrors = pluginMetrics.counter(DOCUMENT_DB_EXPORT_4XX_ERRORS);
         this.exportApiInvocations = pluginMetrics.counter(DOCUMENT_DB_EXPORT_API_INVOCATIONS);
+        this.exportPartitionQueryCount = pluginMetrics.counter(DOCUMENT_DB_EXPORT_PARTITION_QUERY_COUNT);
     }
 
     public Counter getStream5xxErrors() {
@@ -62,4 +63,8 @@ public Counter getExport4xxErrors() {
     public Counter getExportApiInvocations() {
         return exportApiInvocations;
     }
+
+    public Counter getExportPartitionQueryCount() {
+        return exportPartitionQueryCount;
+    }
 }
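The stream-checkpoint change replaces a boolean acknowledged flag, plus a wait-twice-the-timeout heuristic, with an explicit tri-state: no response yet, POSITIVE_ACK, or NEGATIVE_ACK, so the monitor can give up the partition as soon as a sink reports failure instead of waiting out the timeout. A sketch of the shape this implies for CheckpointStatus; the enum constants and method names come from the diff, the field layout is assumed:

// Sketch only; the real CheckpointStatus lives in the mongodb plugin and may differ in detail.
class CheckpointStatusSketch {
    enum AcknowledgmentStatus { POSITIVE_ACK, NEGATIVE_ACK }

    private AcknowledgmentStatus acknowledgmentStatus; // null until the sink responds
    private long acknowledgedTimestamp;

    void setAcknowledged(final AcknowledgmentStatus status) {
        this.acknowledgmentStatus = status;
    }

    void setAcknowledgedTimestamp(final long epochMilli) {
        this.acknowledgedTimestamp = epochMilli;
    }

    boolean isPositiveAcknowledgement() {
        return acknowledgmentStatus == AcknowledgmentStatus.POSITIVE_ACK;
    }

    boolean isNegativeAcknowledgement() {
        return acknowledgmentStatus == AcknowledgmentStatus.NEGATIVE_ACK;
    }
}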
diff --git a/data-prepper-plugins/mongodb/src/test/java/org/opensearch/dataprepper/plugins/mongo/documentdb/MongoTasksRefresherTest.java b/data-prepper-plugins/mongodb/src/test/java/org/opensearch/dataprepper/plugins/mongo/documentdb/MongoTasksRefresherTest.java
index b48e097cc0..9ce93c8ded 100644
--- a/data-prepper-plugins/mongodb/src/test/java/org/opensearch/dataprepper/plugins/mongo/documentdb/MongoTasksRefresherTest.java
+++ b/data-prepper-plugins/mongodb/src/test/java/org/opensearch/dataprepper/plugins/mongo/documentdb/MongoTasksRefresherTest.java
@@ -254,4 +254,18 @@ void testTaskRefreshWithNullS3PathPrefix() {
                 buffer, enhancedSourceCoordinator, pluginMetrics, acknowledgementSetManager,
                 executorServiceFunction, null, documentDBSourceAggregateMetrics));
     }
+
+    @Test
+    void testTaskRefreshShutdown() {
+        final MongoTasksRefresher objectUnderTest = createObjectUnderTest();
+        objectUnderTest.initialize(sourceConfig);
+        objectUnderTest.shutdown();
+        verify(executorServiceFunction).apply(eq(3));
+        verify(executorService).submit(any(ExportScheduler.class));
+        verify(executorService).submit(any(ExportWorker.class));
+        verify(executorService).submit(any(StreamScheduler.class));
+        verify(executorService).shutdownNow();
+        verifyNoMoreInteractions(executorServiceFunction);
+
+    }
 }
\ No newline at end of file
diff --git a/data-prepper-plugins/mongodb/src/test/java/org/opensearch/dataprepper/plugins/mongo/export/MongoDBExportPartitionSupplierTest.java b/data-prepper-plugins/mongodb/src/test/java/org/opensearch/dataprepper/plugins/mongo/export/MongoDBExportPartitionSupplierTest.java
index e8307e2d6d..0329cf7b72 100644
--- a/data-prepper-plugins/mongodb/src/test/java/org/opensearch/dataprepper/plugins/mongo/export/MongoDBExportPartitionSupplierTest.java
+++ b/data-prepper-plugins/mongodb/src/test/java/org/opensearch/dataprepper/plugins/mongo/export/MongoDBExportPartitionSupplierTest.java
@@ -21,6 +21,7 @@
 import org.mockito.MockedStatic;
 import org.mockito.junit.jupiter.MockitoExtension;
 import org.opensearch.dataprepper.model.source.coordinator.PartitionIdentifier;
+import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourceCoordinator;
 import org.opensearch.dataprepper.plugins.mongo.client.MongoDBConnection;
 import org.opensearch.dataprepper.plugins.mongo.configuration.CollectionConfig;
 import org.opensearch.dataprepper.plugins.mongo.configuration.MongoDBSourceConfig;
@@ -53,6 +54,9 @@ public class MongoDBExportPartitionSupplierTest {
     @Mock
     private MongoDBSourceConfig mongoDBConfig;
 
+    @Mock
+    private EnhancedSourceCoordinator sourceCoordinator;
+
     @Mock
     private DocumentDBSourceAggregateMetrics documentDBSourceAggregateMetrics;
 
@@ -65,6 +69,8 @@ public class MongoDBExportPartitionSupplierTest {
     @Mock
     private Counter exportApiInvocations;
     @Mock
+    private Counter exportPartitionQueryCount;
+    @Mock
     private Counter export4xxErrors;
     @Mock
     private Counter export5xxErrors;
@@ -77,9 +83,10 @@ public void setup() {
         lenient().when(collectionConfig.getCollectionName()).thenReturn(TEST_COLLECTION_NAME);
         lenient().when(mongoDBConfig.getCollections()).thenReturn(Collections.singletonList(collectionConfig));
         when(documentDBSourceAggregateMetrics.getExportApiInvocations()).thenReturn(exportApiInvocations);
+        lenient().when(documentDBSourceAggregateMetrics.getExportPartitionQueryCount()).thenReturn(exportPartitionQueryCount);
         lenient().when(documentDBSourceAggregateMetrics.getExport4xxErrors()).thenReturn(export4xxErrors);
         lenient().when(documentDBSourceAggregateMetrics.getExport5xxErrors()).thenReturn(export5xxErrors);
-        testSupplier = new MongoDBExportPartitionSupplier(mongoDBConfig, documentDBSourceAggregateMetrics);
+        testSupplier = new MongoDBExportPartitionSupplier(mongoDBConfig, sourceCoordinator, documentDBSourceAggregateMetrics);
     }
 
     @Test
@@ -121,6 +128,7 @@ public void test_buildPartitionsCollection() {
         verify(mongoClient, times(1)).close();
         verify(mongoDatabase).getCollection(eq("collection"));
         verify(exportApiInvocations).increment();
+        verify(exportPartitionQueryCount, times(2)).increment();
         verify(export4xxErrors, never()).increment();
         verify(export5xxErrors, never()).increment();
         // And partitions are created
@@ -135,6 +143,7 @@ public void test_buildPartitionsForCollection_error() {
         when(exportPartition.getCollection()).thenReturn("invalidDBName");
         assertThrows(IllegalArgumentException.class, () -> testSupplier.apply(exportPartition));
         verify(exportApiInvocations).increment();
+        verify(exportPartitionQueryCount, never()).increment();
         verify(export4xxErrors).increment();
         verify(export5xxErrors, never()).increment();
     }
@@ -146,6 +155,7 @@ public void test_buildPartitions_dbException() {
                 .thenThrow(MongoClientException.class);
         assertThrows(RuntimeException.class, () -> testSupplier.apply(exportPartition));
         verify(exportApiInvocations).increment();
+        verify(exportPartitionQueryCount, never()).increment();
         verify(export4xxErrors).increment();
         verify(export5xxErrors, never()).increment();
     }
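test_buildPartitionsCollection expects exportPartitionQueryCount to be incremented twice for a collection that yields two partitions, which suggests one increment per partition-boundary query rather than one per export API invocation. An illustrative loop follows; the helper name is hypothetical, and only the getter and Micrometer's Counter.increment() come from the diff:

import io.micrometer.core.instrument.Counter;
import org.opensearch.dataprepper.plugins.mongo.utils.DocumentDBSourceAggregateMetrics;

class PartitionQueryCountSketch {
    void buildPartitions(final DocumentDBSourceAggregateMetrics metrics) {
        final Counter queryCount = metrics.getExportPartitionQueryCount();
        boolean morePartitions = true;
        while (morePartitions) {
            queryCount.increment(); // one increment per boundary query against the collection
            morePartitions = queryNextPartitionBoundary(); // hypothetical helper
        }
    }

    private boolean queryNextPartitionBoundary() {
        return false; // placeholder for the query that locates the next partition boundary
    }
}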
diff --git a/data-prepper-plugins/mongodb/src/test/java/org/opensearch/dataprepper/plugins/mongo/stream/StreamAcknowledgementManagerTest.java b/data-prepper-plugins/mongodb/src/test/java/org/opensearch/dataprepper/plugins/mongo/stream/StreamAcknowledgementManagerTest.java
index 78e2a51503..4e41008627 100644
--- a/data-prepper-plugins/mongodb/src/test/java/org/opensearch/dataprepper/plugins/mongo/stream/StreamAcknowledgementManagerTest.java
+++ b/data-prepper-plugins/mongodb/src/test/java/org/opensearch/dataprepper/plugins/mongo/stream/StreamAcknowledgementManagerTest.java
@@ -73,7 +73,7 @@ public void createAcknowledgementSet_enabled_ackSetWithAck() {
         consumer.accept(true);
         final ConcurrentHashMap ackStatus = streamAckManager.getAcknowledgementStatus();
         final CheckpointStatus ackCheckpointStatus = ackStatus.get(resumeToken);
-        assertThat(ackCheckpointStatus.isAcknowledged(), is(true));
+        assertThat(ackCheckpointStatus.isPositiveAcknowledgement(), is(true));
         await()
             .atMost(Duration.ofSeconds(10)).untilAsserted(() ->
                 verify(partitionCheckpoint).checkpoint(resumeToken, recordCount));
@@ -109,7 +109,7 @@ public void createAcknowledgementSet_enabled_multipleAckSetWithAck() {
         consumers.get(1).accept(true);
         ConcurrentHashMap ackStatus = streamAckManager.getAcknowledgementStatus();
         CheckpointStatus ackCheckpointStatus = ackStatus.get(resumeToken2);
-        assertThat(ackCheckpointStatus.isAcknowledged(), is(true));
+        assertThat(ackCheckpointStatus.isPositiveAcknowledgement(), is(true));
         await()
             .atMost(Duration.ofSeconds(10)).untilAsserted(() ->
                 verify(partitionCheckpoint).checkpoint(resumeToken2, recordCount2));
@@ -143,7 +143,7 @@ public void createAcknowledgementSet_enabled_multipleAckSetWithAckFailure() {
         consumers.get(1).accept(true);
         ConcurrentHashMap ackStatus = streamAckManager.getAcknowledgementStatus();
         CheckpointStatus ackCheckpointStatus = ackStatus.get(resumeToken2);
-        assertThat(ackCheckpointStatus.isAcknowledged(), is(true));
+        assertThat(ackCheckpointStatus.isPositiveAcknowledgement(), is(true));
         await()
             .atMost(Duration.ofSeconds(10)).untilAsserted(() ->
                 verify(partitionCheckpoint).giveUpPartition());
@@ -169,7 +169,7 @@ public void createAcknowledgementSet_enabled_ackSetWithNoAck() {
         consumer.accept(false);
         final ConcurrentHashMap ackStatus = streamAckManager.getAcknowledgementStatus();
         final CheckpointStatus ackCheckpointStatus = ackStatus.get(resumeToken);
-        assertThat(ackCheckpointStatus.isAcknowledged(), is(false));
+        assertThat(ackCheckpointStatus.isPositiveAcknowledgement(), is(false));
         await()
             .atMost(Duration.ofSeconds(10)).untilAsserted(() ->
                 verify(stopWorkerConsumer).accept(null));
diff --git a/data-prepper-plugins/mutate-event-processors/build.gradle b/data-prepper-plugins/mutate-event-processors/build.gradle
index 3fbbc37254..e4b0c63cea 100644
--- a/data-prepper-plugins/mutate-event-processors/build.gradle
+++ b/data-prepper-plugins/mutate-event-processors/build.gradle
@@ -22,4 +22,6 @@ dependencies {
     implementation project(':data-prepper-api')
     implementation project(':data-prepper-plugins:common')
     implementation 'com.fasterxml.jackson.core:jackson-databind'
+    testImplementation project(':data-prepper-test-event')
+    testImplementation testLibs.slf4j.simple
 }
\ No newline at end of file
diff --git a/data-prepper-plugins/mutate-event-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutateevent/DeleteEntryProcessor.java b/data-prepper-plugins/mutate-event-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutateevent/DeleteEntryProcessor.java
index d7c902a32c..cfadf70d03 100644
--- a/data-prepper-plugins/mutate-event-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutateevent/DeleteEntryProcessor.java
+++ b/data-prepper-plugins/mutate-event-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutateevent/DeleteEntryProcessor.java
@@ -10,6 +10,7 @@
 import org.opensearch.dataprepper.model.annotations.DataPrepperPlugin;
 import org.opensearch.dataprepper.model.annotations.DataPrepperPluginConstructor;
 import org.opensearch.dataprepper.model.event.Event;
+import org.opensearch.dataprepper.model.event.EventKey;
 import org.opensearch.dataprepper.model.processor.AbstractProcessor;
 import org.opensearch.dataprepper.model.processor.Processor;
 import org.opensearch.dataprepper.model.record.Record;
@@ -17,6 +18,7 @@
 import org.slf4j.LoggerFactory;
 
 import java.util.Collection;
+import java.util.List;
 import java.util.Objects;
 
 import static org.opensearch.dataprepper.logging.DataPrepperMarkers.EVENT;
@@ -25,7 +27,7 @@
 public class DeleteEntryProcessor extends AbstractProcessor, Record> {
     private static final Logger LOG = LoggerFactory.getLogger(DeleteEntryProcessor.class);
-    private final String[] entries;
+    private final List entries;
 
     private final String deleteWhen;
     private final ExpressionEvaluator expressionEvaluator;
@@ -49,7 +51,7 @@ public Collection> doExecute(final Collection> recor
                 }
 
-                for (String entry : entries) {
+                for (final EventKey entry : entries) {
                     recordEvent.delete(entry);
                 }
             } catch (final Exception e) {
diff --git a/data-prepper-plugins/mutate-event-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutateevent/DeleteEntryProcessorConfig.java b/data-prepper-plugins/mutate-event-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutateevent/DeleteEntryProcessorConfig.java
index 8470576a7b..b1df976770 100644
--- a/data-prepper-plugins/mutate-event-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutateevent/DeleteEntryProcessorConfig.java
+++ b/data-prepper-plugins/mutate-event-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutateevent/DeleteEntryProcessorConfig.java
@@ -6,19 +6,29 @@
 package org.opensearch.dataprepper.plugins.processor.mutateevent;
 
 import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.annotation.JsonPropertyDescription;
 import jakarta.validation.constraints.NotEmpty;
 import jakarta.validation.constraints.NotNull;
+import org.opensearch.dataprepper.model.event.EventKey;
+import org.opensearch.dataprepper.model.event.EventKeyConfiguration;
+import org.opensearch.dataprepper.model.event.EventKeyFactory;
+
+import java.util.List;
 
 public class DeleteEntryProcessorConfig {
     @NotEmpty
     @NotNull
     @JsonProperty("with_keys")
-    private String[] withKeys;
+    @EventKeyConfiguration(EventKeyFactory.EventAction.DELETE)
+    @JsonPropertyDescription("An array of keys for the entries to be deleted.")
+    private List<@NotNull @NotEmpty EventKey> withKeys;
 
     @JsonProperty("delete_when")
+    @JsonPropertyDescription("Specifies under what condition the `delete_entries` processor should perform deletion. " +
+            "Default is no condition.")
     private String deleteWhen;
 
-    public String[] getWithKeys() {
+    public List getWithKeys() {
         return withKeys;
     }
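The with_keys migration from String[] to List<EventKey> moves key parsing to configuration time: keys are created once through an EventKeyFactory with a declared action and reused for every event. A sketch of the pattern, using the factory call and Event.delete(EventKey) shown in the diff; in a real plugin the factory arrives through the @DataPrepperPluginConstructor, and tests obtain one from TestEventKeyFactory.getTestEventFactory(). The method shown is illustrative:

import org.opensearch.dataprepper.model.event.Event;
import org.opensearch.dataprepper.model.event.EventKey;
import org.opensearch.dataprepper.model.event.EventKeyFactory;

class EventKeyUsageSketch {
    void deleteMessageKey(final EventKeyFactory eventKeyFactory, final Event event) {
        // Parsed and validated once, at configuration time, for the DELETE action...
        final EventKey key = eventKeyFactory.createEventKey("message", EventKeyFactory.EventAction.DELETE);
        // ...so the per-event work is only the delete itself.
        event.delete(key);
    }
}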
diff --git a/data-prepper-plugins/mutate-event-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutateevent/RenameKeyProcessorConfig.java b/data-prepper-plugins/mutate-event-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutateevent/RenameKeyProcessorConfig.java
index f1e723ad5a..d1ee0178a6 100644
--- a/data-prepper-plugins/mutate-event-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutateevent/RenameKeyProcessorConfig.java
+++ b/data-prepper-plugins/mutate-event-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutateevent/RenameKeyProcessorConfig.java
@@ -9,6 +9,9 @@
 import jakarta.validation.Valid;
 import jakarta.validation.constraints.NotEmpty;
 import jakarta.validation.constraints.NotNull;
+import org.opensearch.dataprepper.model.event.EventKey;
+import org.opensearch.dataprepper.model.event.EventKeyConfiguration;
+import org.opensearch.dataprepper.model.event.EventKeyFactory;
 
 import java.util.List;
 
@@ -17,12 +20,14 @@ public static class Entry {
         @NotEmpty
         @NotNull
         @JsonProperty("from_key")
-        private String fromKey;
+        @EventKeyConfiguration({EventKeyFactory.EventAction.GET, EventKeyFactory.EventAction.DELETE})
+        private EventKey fromKey;
 
         @NotEmpty
         @NotNull
         @JsonProperty("to_key")
-        private String toKey;
+        @EventKeyConfiguration(EventKeyFactory.EventAction.PUT)
+        private EventKey toKey;
 
         @JsonProperty("overwrite_if_to_key_exists")
         private boolean overwriteIfToKeyExists = false;
@@ -30,11 +35,11 @@ public static class Entry {
         @JsonProperty("rename_when")
         private String renameWhen;
 
-        public String getFromKey() {
+        public EventKey getFromKey() {
             return fromKey;
         }
 
-        public String getToKey() {
+        public EventKey getToKey() {
             return toKey;
         }
 
@@ -44,7 +49,7 @@ public boolean getOverwriteIfToKeyExists() {
         public String getRenameWhen() {
             return renameWhen;
         }
-        public Entry(final String fromKey, final String toKey, final boolean overwriteIfKeyExists, final String renameWhen) {
+        public Entry(final EventKey fromKey, final EventKey toKey, final boolean overwriteIfKeyExists, final String renameWhen) {
             this.fromKey = fromKey;
             this.toKey = toKey;
             this.overwriteIfToKeyExists = overwriteIfKeyExists;
diff --git a/data-prepper-plugins/mutate-event-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutateevent/DeleteEntryProcessorTests.java b/data-prepper-plugins/mutate-event-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutateevent/DeleteEntryProcessorTests.java
index 2394a5d958..bc0fb78870 100644
--- a/data-prepper-plugins/mutate-event-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutateevent/DeleteEntryProcessorTests.java
+++ b/data-prepper-plugins/mutate-event-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutateevent/DeleteEntryProcessorTests.java
@@ -5,15 +5,17 @@
 package org.opensearch.dataprepper.plugins.processor.mutateevent;
 
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import org.opensearch.dataprepper.event.TestEventKeyFactory;
 import org.opensearch.dataprepper.expression.ExpressionEvaluator;
 import org.opensearch.dataprepper.metrics.PluginMetrics;
 import org.opensearch.dataprepper.model.event.Event;
+import org.opensearch.dataprepper.model.event.EventKeyFactory;
 import org.opensearch.dataprepper.model.event.JacksonEvent;
 import org.opensearch.dataprepper.model.record.Record;
-import org.junit.jupiter.api.Test;
-import org.junit.jupiter.api.extension.ExtendWith;
-import org.mockito.Mock;
-import org.mockito.junit.jupiter.MockitoExtension;
 
 import java.util.Collections;
 import java.util.HashMap;
@@ -36,9 +38,11 @@ public class DeleteEntryProcessorTests {
     @Mock
     private ExpressionEvaluator expressionEvaluator;
 
+    private final EventKeyFactory eventKeyFactory = TestEventKeyFactory.getTestEventFactory();
+
     @Test
     public void testSingleDeleteProcessorTest() {
-        when(mockConfig.getWithKeys()).thenReturn(new String[] { "message" });
+        when(mockConfig.getWithKeys()).thenReturn(List.of(eventKeyFactory.createEventKey("message", EventKeyFactory.EventAction.DELETE)));
         when(mockConfig.getDeleteWhen()).thenReturn(null);
 
         final DeleteEntryProcessor processor = createObjectUnderTest();
@@ -52,7 +56,7 @@ public void testSingleDeleteProcessorTest() {
 
     @Test
     public void testWithKeyDneDeleteProcessorTest() {
-        when(mockConfig.getWithKeys()).thenReturn(new String[] { "message2" });
+        when(mockConfig.getWithKeys()).thenReturn(List.of(eventKeyFactory.createEventKey("message2", EventKeyFactory.EventAction.DELETE)));
         when(mockConfig.getDeleteWhen()).thenReturn(null);
 
         final DeleteEntryProcessor processor = createObjectUnderTest();
@@ -67,7 +71,9 @@ public void testWithKeyDneDeleteProcessorTest() {
 
     @Test
     public void testMultiDeleteProcessorTest() {
-        when(mockConfig.getWithKeys()).thenReturn(new String[] { "message", "message2" });
+        when(mockConfig.getWithKeys()).thenReturn(List.of(
+                eventKeyFactory.createEventKey("message", EventKeyFactory.EventAction.DELETE),
+                eventKeyFactory.createEventKey("message2", EventKeyFactory.EventAction.DELETE)));
         when(mockConfig.getDeleteWhen()).thenReturn(null);
 
         final DeleteEntryProcessor processor = createObjectUnderTest();
@@ -83,7 +89,7 @@ public void testMultiDeleteProcessorTest() {
 
     @Test
     public void testKeyIsNotDeleted_when_deleteWhen_returns_false() {
-        when(mockConfig.getWithKeys()).thenReturn(new String[] { "message" });
+        when(mockConfig.getWithKeys()).thenReturn(List.of(eventKeyFactory.createEventKey("message", EventKeyFactory.EventAction.DELETE)));
         final String deleteWhen = UUID.randomUUID().toString();
         when(mockConfig.getDeleteWhen()).thenReturn(deleteWhen);
 
@@ -98,8 +104,9 @@ public void testKeyIsNotDeleted_when_deleteWhen_returns_false() {
         assertThat(editedRecords.get(0).getData().containsKey("newMessage"), is(true));
     }
 
+    @Test
     public void testNestedDeleteProcessorTest() {
-        when(mockConfig.getWithKeys()).thenReturn(new String[]{"nested/foo"});
+        when(mockConfig.getWithKeys()).thenReturn(List.of(eventKeyFactory.createEventKey("nested/foo", EventKeyFactory.EventAction.DELETE)));
 
         Map nested = Map.of("foo", "bar", "fizz", 42);
diff --git a/data-prepper-plugins/mutate-event-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutateevent/RenameKeyProcessorTests.java b/data-prepper-plugins/mutate-event-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutateevent/RenameKeyProcessorTests.java
index dfc5a7b595..6ae362bc46 100644
--- a/data-prepper-plugins/mutate-event-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutateevent/RenameKeyProcessorTests.java
+++ b/data-prepper-plugins/mutate-event-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutateevent/RenameKeyProcessorTests.java
@@ -5,9 +5,12 @@
 package org.opensearch.dataprepper.plugins.processor.mutateevent;
 
+import org.opensearch.dataprepper.event.TestEventKeyFactory;
 import org.opensearch.dataprepper.expression.ExpressionEvaluator;
 import org.opensearch.dataprepper.metrics.PluginMetrics;
 import org.opensearch.dataprepper.model.event.Event;
+import org.opensearch.dataprepper.model.event.EventKey;
+import org.opensearch.dataprepper.model.event.EventKeyFactory;
 import org.opensearch.dataprepper.model.event.JacksonEvent;
 import org.opensearch.dataprepper.model.record.Record;
 import org.junit.jupiter.api.Test;
@@ -39,6 +42,8 @@ public class RenameKeyProcessorTests {
     @Mock
     private ExpressionEvaluator expressionEvaluator;
 
+    private final EventKeyFactory eventKeyFactory = TestEventKeyFactory.getTestEventFactory();
+
     @Test
     public void testSingleOverwriteRenameProcessorTests() {
         when(mockConfig.getEntries()).thenReturn(createListOfEntries(createEntry("message", "newMessage", true, null)));
@@ -136,7 +141,9 @@ private RenameKeyProcessor createObjectUnderTest() {
     }
 
     private RenameKeyProcessorConfig.Entry createEntry(final String fromKey, final String toKey, final boolean overwriteIfToKeyExists, final String renameWhen) {
-        return new RenameKeyProcessorConfig.Entry(fromKey, toKey, overwriteIfToKeyExists, renameWhen);
+        final EventKey fromEventKey = eventKeyFactory.createEventKey(fromKey);
+        final EventKey toEventKey = eventKeyFactory.createEventKey(toKey);
+        return new RenameKeyProcessorConfig.Entry(fromEventKey, toEventKey, overwriteIfToKeyExists, renameWhen);
     }
 
     private List createListOfEntries(final RenameKeyProcessorConfig.Entry... entries) {
diff --git a/data-prepper-plugins/mutate-string-processors/build.gradle b/data-prepper-plugins/mutate-string-processors/build.gradle
index 3fbbc37254..0723e63c10 100644
--- a/data-prepper-plugins/mutate-string-processors/build.gradle
+++ b/data-prepper-plugins/mutate-string-processors/build.gradle
@@ -22,4 +22,5 @@ dependencies {
     implementation project(':data-prepper-api')
     implementation project(':data-prepper-plugins:common')
     implementation 'com.fasterxml.jackson.core:jackson-databind'
+    testImplementation project(':data-prepper-test-event')
 }
\ No newline at end of file
diff --git a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/AbstractStringProcessor.java b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/AbstractStringProcessor.java
index 19d11daf62..ae7a242da3 100644
--- a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/AbstractStringProcessor.java
+++ b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/AbstractStringProcessor.java
@@ -8,6 +8,7 @@
 import org.opensearch.dataprepper.metrics.PluginMetrics;
 import org.opensearch.dataprepper.model.annotations.DataPrepperPluginConstructor;
 import org.opensearch.dataprepper.model.event.Event;
+import org.opensearch.dataprepper.model.event.EventKey;
 import org.opensearch.dataprepper.model.processor.AbstractProcessor;
 import org.opensearch.dataprepper.model.record.Record;
 
@@ -46,8 +47,8 @@ public Collection> doExecute(final Collection> recor
     private void performStringAction(final Event recordEvent)
     {
         try {
-            for(T entry : entries) {
-                final String key = getKey(entry);
+            for(final T entry : entries) {
+                final EventKey key = getKey(entry);
 
                 if(recordEvent.containsKey(key)) {
                     final Object value = recordEvent.get(key, Object.class);
@@ -64,7 +65,7 @@ private void performStringAction(final Event recordEvent)
 
     protected abstract void performKeyAction(final Event recordEvent, final T entry, final String value);
 
-    protected abstract String getKey(final T entry);
+    protected abstract EventKey getKey(final T entry);
 
     @Override
     public void prepareForShutdown() {
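AbstractStringProcessor now resolves an EventKey rather than a String, and the Event API accepts the pre-parsed key directly in containsKey, get, and put. A simplified sketch of the per-event flow the base class drives; trim is just an example action, since each subclass supplies its own through performKeyAction:

import org.opensearch.dataprepper.model.event.Event;
import org.opensearch.dataprepper.model.event.EventKey;

class StringActionSketch {
    void apply(final Event recordEvent, final EventKey key) {
        if (recordEvent.containsKey(key)) {
            final Object value = recordEvent.get(key, Object.class);
            if (value instanceof String) {
                // a subclass's performKeyAction would run here; trim is only an example
                recordEvent.put(key, ((String) value).trim());
            }
        }
    }
}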
*/ @DataPrepperPlugin(name = "lowercase_string", pluginType = Processor.class, pluginConfigurationType = WithKeysConfig.class) -public class LowercaseStringProcessor extends AbstractStringProcessor { +public class LowercaseStringProcessor extends AbstractStringProcessor { @DataPrepperPluginConstructor public LowercaseStringProcessor(final PluginMetrics pluginMetrics, final WithKeysConfig config) { super(pluginMetrics, config); } @Override - protected void performKeyAction(final Event recordEvent, final String key, final String value) + protected void performKeyAction(final Event recordEvent, final EventKey key, final String value) { recordEvent.put(key, value.toLowerCase(Locale.ROOT)); } @Override - protected String getKey(final String entry) { + protected EventKey getKey(final EventKey entry) { return entry; } } diff --git a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SplitStringProcessor.java b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SplitStringProcessor.java index acac832095..6bc89178d8 100644 --- a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SplitStringProcessor.java +++ b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SplitStringProcessor.java @@ -10,6 +10,7 @@ import org.opensearch.dataprepper.model.annotations.DataPrepperPlugin; import org.opensearch.dataprepper.model.annotations.DataPrepperPluginConstructor; import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventKey; import org.opensearch.dataprepper.model.processor.Processor; import java.util.HashMap; @@ -64,7 +65,7 @@ protected void performKeyAction(final Event recordEvent, final SplitStringProces } @Override - protected String getKey(final SplitStringProcessorConfig.Entry entry) { + protected EventKey getKey(final SplitStringProcessorConfig.Entry entry) { return entry.getSource(); } diff --git a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SplitStringProcessorConfig.java b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SplitStringProcessorConfig.java index 84e4228798..cb8edabfb6 100644 --- a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SplitStringProcessorConfig.java +++ b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SplitStringProcessorConfig.java @@ -7,10 +7,12 @@ import com.fasterxml.jackson.annotation.JsonIgnore; import com.fasterxml.jackson.annotation.JsonProperty; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; import jakarta.validation.Valid; import jakarta.validation.constraints.NotEmpty; import jakarta.validation.constraints.NotNull; import jakarta.validation.constraints.Size; +import org.opensearch.dataprepper.model.event.EventKey; import java.util.List; @@ -19,18 +21,26 @@ public static class Entry { @NotEmpty @NotNull - private String source; + @JsonPropertyDescription("The key to split.") + private EventKey source; @JsonProperty("delimiter_regex") + @JsonPropertyDescription("The regex string responsible for the split. Cannot be defined at the same time as `delimiter`. 
" + + "At least `delimiter` or `delimiter_regex` must be defined.") private String delimiterRegex; @Size(min = 1, max = 1) + @JsonPropertyDescription("The separator character responsible for the split. " + + "Cannot be defined at the same time as `delimiter_regex`. " + + "At least `delimiter` or `delimiter_regex` must be defined.") private String delimiter; @JsonProperty("split_when") + @JsonPropertyDescription("Specifies under what condition the `split_string` processor should perform splitting. " + + "Default is no condition.") private String splitWhen; - public String getSource() { + public EventKey getSource() { return source; } @@ -44,7 +54,7 @@ public String getDelimiter() { public String getSplitWhen() { return splitWhen; } - public Entry(final String source, final String delimiterRegex, final String delimiter, final String splitWhen) { + public Entry(final EventKey source, final String delimiterRegex, final String delimiter, final String splitWhen) { this.source = source; this.delimiterRegex = delimiterRegex; this.delimiter = delimiter; @@ -60,6 +70,7 @@ public List getIterativeConfig() { return entries; } + @JsonPropertyDescription("List of entries. Valid values are `source`, `delimiter`, and `delimiter_regex`.") private List<@Valid Entry> entries; public List getEntries() { diff --git a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SubstituteStringProcessor.java b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SubstituteStringProcessor.java index 7332ce836f..e6dceb62fc 100644 --- a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SubstituteStringProcessor.java +++ b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SubstituteStringProcessor.java @@ -10,6 +10,7 @@ import org.opensearch.dataprepper.model.annotations.DataPrepperPlugin; import org.opensearch.dataprepper.model.annotations.DataPrepperPluginConstructor; import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventKey; import org.opensearch.dataprepper.model.processor.Processor; import java.util.HashMap; @@ -51,7 +52,7 @@ protected void performKeyAction(final Event recordEvent, final SubstituteStringP } @Override - protected String getKey(final SubstituteStringProcessorConfig.Entry entry) { + protected EventKey getKey(final SubstituteStringProcessorConfig.Entry entry) { return entry.getSource(); } } diff --git a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SubstituteStringProcessorConfig.java b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SubstituteStringProcessorConfig.java index 07789b083a..4a8f53f0fe 100644 --- a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SubstituteStringProcessorConfig.java +++ b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SubstituteStringProcessorConfig.java @@ -6,19 +6,27 @@ package org.opensearch.dataprepper.plugins.processor.mutatestring; import com.fasterxml.jackson.annotation.JsonProperty; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; +import 
org.opensearch.dataprepper.model.event.EventKey; import java.util.List; public class SubstituteStringProcessorConfig implements StringProcessorConfig<SubstituteStringProcessorConfig.Entry> { public static class Entry { - private String source; + @JsonPropertyDescription("The key to modify.") + private EventKey source; + @JsonPropertyDescription("The regex string to be replaced. Special regex characters such as `[` and `]` must " + + "be escaped using `\\\\` when using double quotes and `\\ ` when using single quotes. " + + "See [Java Patterns](https://docs.oracle.com/en/java/javase/17/docs/api/java.base/java/util/regex/Pattern.html) " + + "for more information.") private String from; + @JsonPropertyDescription("The string to be substituted for each match of `from`.") private String to; @JsonProperty("substitute_when") private String substituteWhen; - public String getSource() { + public EventKey getSource() { return source; } @@ -32,7 +40,7 @@ public String getTo() { public String getSubstituteWhen() { return substituteWhen; } - public Entry(final String source, final String from, final String to, final String substituteWhen) { + public Entry(final EventKey source, final String from, final String to, final String substituteWhen) { this.source = source; this.from = from; this.to = to; @@ -42,6 +50,7 @@ public Entry(final String source, final String from, final String to, final Stri public Entry() {} } + @JsonPropertyDescription("List of entries. Valid options for each entry are `source`, `from`, and `to`.") private List<Entry> entries; public List<Entry> getEntries() { diff --git a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/TrimStringProcessor.java b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/TrimStringProcessor.java index 2f0e5f0dc2..2a1213f30f 100644 --- a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/TrimStringProcessor.java +++ b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/TrimStringProcessor.java @@ -9,6 +9,7 @@ import org.opensearch.dataprepper.model.annotations.DataPrepperPlugin; import org.opensearch.dataprepper.model.annotations.DataPrepperPluginConstructor; import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventKey; import org.opensearch.dataprepper.model.processor.Processor; /** @@ -16,20 +17,20 @@ * If the value is not a string, no action is performed.
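 * For example, a value of " hello " becomes "hello".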
*/ @DataPrepperPlugin(name = "trim_string", pluginType = Processor.class, pluginConfigurationType = WithKeysConfig.class) -public class TrimStringProcessor extends AbstractStringProcessor { +public class TrimStringProcessor extends AbstractStringProcessor { @DataPrepperPluginConstructor public TrimStringProcessor(final PluginMetrics pluginMetrics, final WithKeysConfig config) { super(pluginMetrics, config); } @Override - protected void performKeyAction(final Event recordEvent, final String key, final String value) + protected void performKeyAction(final Event recordEvent, final EventKey key, final String value) { recordEvent.put(key, value.trim()); } @Override - protected String getKey(final String entry) { + protected EventKey getKey(final EventKey entry) { return entry; } } diff --git a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/UppercaseStringProcessor.java b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/UppercaseStringProcessor.java index 9d3665fdd2..28e7aa9847 100644 --- a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/UppercaseStringProcessor.java +++ b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/UppercaseStringProcessor.java @@ -9,6 +9,7 @@ import org.opensearch.dataprepper.model.annotations.DataPrepperPlugin; import org.opensearch.dataprepper.model.annotations.DataPrepperPluginConstructor; import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventKey; import org.opensearch.dataprepper.model.processor.Processor; import java.util.Locale; @@ -18,19 +19,19 @@ * no action is performed. 
*/ @DataPrepperPlugin(name = "uppercase_string", pluginType = Processor.class, pluginConfigurationType = WithKeysConfig.class) -public class UppercaseStringProcessor extends AbstractStringProcessor { +public class UppercaseStringProcessor extends AbstractStringProcessor { @DataPrepperPluginConstructor public UppercaseStringProcessor(final PluginMetrics pluginMetrics, final WithKeysConfig config) { super(pluginMetrics, config); } @Override - protected String getKey(final String entry) { + protected EventKey getKey(final EventKey entry) { return entry; } @Override - protected void performKeyAction(final Event recordEvent, final String entry, final String value) + protected void performKeyAction(final Event recordEvent, final EventKey entry, final String value) { recordEvent.put(entry, value.toUpperCase(Locale.ROOT)); } diff --git a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/WithKeysConfig.java b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/WithKeysConfig.java index bfe10d02ca..3660b5d73d 100644 --- a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/WithKeysConfig.java +++ b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/WithKeysConfig.java @@ -6,24 +6,27 @@ package org.opensearch.dataprepper.plugins.processor.mutatestring; import com.fasterxml.jackson.annotation.JsonProperty; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; import jakarta.validation.constraints.NotEmpty; import jakarta.validation.constraints.NotNull; +import org.opensearch.dataprepper.model.event.EventKey; import java.util.List; -public class WithKeysConfig implements StringProcessorConfig { +public class WithKeysConfig implements StringProcessorConfig { @NotNull @NotEmpty @JsonProperty("with_keys") - private List withKeys; + @JsonPropertyDescription("A list of keys to trim the white space from.") + private List withKeys; @Override - public List getIterativeConfig() { + public List getIterativeConfig() { return withKeys; } - public List getWithKeys() { + public List getWithKeys() { return withKeys; } } diff --git a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/WithKeysProcessorConfig.java b/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/WithKeysProcessorConfig.java deleted file mode 100644 index 814518c83d..0000000000 --- a/data-prepper-plugins/mutate-string-processors/src/main/java/org/opensearch/dataprepper/plugins/processor/mutatestring/WithKeysProcessorConfig.java +++ /dev/null @@ -1,28 +0,0 @@ -/* - * Copyright OpenSearch Contributors - * SPDX-License-Identifier: Apache-2.0 - */ - -package org.opensearch.dataprepper.plugins.processor.mutatestring; - -import com.fasterxml.jackson.annotation.JsonProperty; -import jakarta.validation.constraints.NotEmpty; -import jakarta.validation.constraints.NotNull; - -import java.util.List; - -public abstract class WithKeysProcessorConfig implements StringProcessorConfig { - @NotEmpty - @NotNull - @JsonProperty("with_keys") - private List withKeys; - - @Override - public List getIterativeConfig() { - return withKeys; - } - - public List getWithKeys() { - return withKeys; - } -} diff --git 
a/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/LowercaseStringProcessorTests.java b/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/LowercaseStringProcessorTests.java index 18bddf31a9..8185d8ef8c 100644 --- a/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/LowercaseStringProcessorTests.java +++ b/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/LowercaseStringProcessorTests.java @@ -5,21 +5,26 @@ package org.opensearch.dataprepper.plugins.processor.mutatestring; -import org.opensearch.dataprepper.metrics.PluginMetrics; -import org.opensearch.dataprepper.model.event.Event; -import org.opensearch.dataprepper.model.event.JacksonEvent; -import org.opensearch.dataprepper.model.record.Record; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.dataprepper.event.TestEventFactory; +import org.opensearch.dataprepper.event.TestEventKeyFactory; +import org.opensearch.dataprepper.metrics.PluginMetrics; +import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventBuilder; +import org.opensearch.dataprepper.model.event.EventFactory; +import org.opensearch.dataprepper.model.event.EventKeyFactory; +import org.opensearch.dataprepper.model.record.Record; -import java.util.Arrays; import java.util.Collections; import java.util.HashMap; import java.util.List; import java.util.Map; +import java.util.stream.Collectors; +import java.util.stream.Stream; import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.is; @@ -29,6 +34,9 @@ @ExtendWith(MockitoExtension.class) public class LowercaseStringProcessorTests { + private static final EventFactory TEST_EVENT_FACTORY = TestEventFactory.getTestEventFactory(); + private final EventKeyFactory eventKeyFactory = TestEventKeyFactory.getTestEventFactory(); + @Mock private PluginMetrics pluginMetrics; @@ -37,7 +45,7 @@ public class LowercaseStringProcessorTests { @BeforeEach public void setup() { - lenient().when(config.getIterativeConfig()).thenReturn(Collections.singletonList("message")); + lenient().when(config.getIterativeConfig()).thenReturn(Stream.of("message").map(eventKeyFactory::createEventKey).collect(Collectors.toList())); } @Test @@ -52,7 +60,7 @@ public void testHappyPathLowercaseStringProcessor() { @Test public void testHappyPathMultiLowercaseStringProcessor() { - when(config.getIterativeConfig()).thenReturn(Arrays.asList("message", "message2")); + when(config.getIterativeConfig()).thenReturn(Stream.of("message", "message2").map(eventKeyFactory::createEventKey).collect(Collectors.toList())); final LowercaseStringProcessor processor = createObjectUnderTest(); final Record record = getEvent("THISISAMESSAGE"); @@ -67,7 +75,7 @@ public void testHappyPathMultiLowercaseStringProcessor() { @Test public void testHappyPathMultiMixedLowercaseStringProcessor() { - lenient().when(config.getIterativeConfig()).thenReturn(Arrays.asList("message", "message2")); + lenient().when(config.getIterativeConfig()).thenReturn(Stream.of("message", "message2").map(eventKeyFactory::createEventKey).collect(Collectors.toList())); final LowercaseStringProcessor 
processor = createObjectUnderTest(); final Record record = getEvent("THISISAMESSAGE"); @@ -137,7 +145,7 @@ private Record getEvent(Object message) { } private static Record buildRecordWithEvent(final Map data) { - return new Record<>(JacksonEvent.builder() + return new Record<>(TEST_EVENT_FACTORY.eventBuilder(EventBuilder.class) .withData(data) .withEventType("event") .build()); diff --git a/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SplitStringProcessorTests.java b/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SplitStringProcessorTests.java index 1f2db4a672..7883dcfd05 100644 --- a/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SplitStringProcessorTests.java +++ b/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SplitStringProcessorTests.java @@ -5,10 +5,15 @@ package org.opensearch.dataprepper.plugins.processor.mutatestring; +import org.opensearch.dataprepper.event.TestEventFactory; +import org.opensearch.dataprepper.event.TestEventKeyFactory; import org.opensearch.dataprepper.expression.ExpressionEvaluator; import org.opensearch.dataprepper.metrics.PluginMetrics; import org.opensearch.dataprepper.model.event.Event; -import org.opensearch.dataprepper.model.event.JacksonEvent; +import org.opensearch.dataprepper.model.event.EventBuilder; +import org.opensearch.dataprepper.model.event.EventFactory; +import org.opensearch.dataprepper.model.event.EventKey; +import org.opensearch.dataprepper.model.event.EventKeyFactory; import org.opensearch.dataprepper.model.record.Record; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; @@ -36,6 +41,8 @@ @ExtendWith(MockitoExtension.class) class SplitStringProcessorTests { + private final EventFactory testEventFactory = TestEventFactory.getTestEventFactory(); + private final EventKeyFactory eventKeyFactory = TestEventKeyFactory.getTestEventFactory(); @Mock private PluginMetrics pluginMetrics; @@ -115,13 +122,14 @@ void test_event_is_the_same_when_splitWhen_condition_returns_false() { private SplitStringProcessorConfig.Entry createEntry(final String source, final String delimiterRegex, final String delimiter, final String splitWhen) { - return new SplitStringProcessorConfig.Entry(source, delimiterRegex, delimiter, splitWhen); + final EventKey sourceKey = eventKeyFactory.createEventKey(source); + return new SplitStringProcessorConfig.Entry(sourceKey, delimiterRegex, delimiter, splitWhen); } private Record createEvent(final String message) { final Map eventData = new HashMap<>(); eventData.put("message", message); - return new Record<>(JacksonEvent.builder() + return new Record<>(testEventFactory.eventBuilder(EventBuilder.class) .withEventType("event") .withData(eventData) .build()); diff --git a/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SubstituteStringProcessorTests.java b/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SubstituteStringProcessorTests.java index 04175ee229..dd8d9b1dd8 100644 --- a/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SubstituteStringProcessorTests.java +++ 
b/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/SubstituteStringProcessorTests.java @@ -5,10 +5,15 @@ package org.opensearch.dataprepper.plugins.processor.mutatestring; +import org.opensearch.dataprepper.event.TestEventFactory; +import org.opensearch.dataprepper.event.TestEventKeyFactory; import org.opensearch.dataprepper.expression.ExpressionEvaluator; import org.opensearch.dataprepper.metrics.PluginMetrics; import org.opensearch.dataprepper.model.event.Event; -import org.opensearch.dataprepper.model.event.JacksonEvent; +import org.opensearch.dataprepper.model.event.EventBuilder; +import org.opensearch.dataprepper.model.event.EventFactory; +import org.opensearch.dataprepper.model.event.EventKey; +import org.opensearch.dataprepper.model.event.EventKeyFactory; import org.opensearch.dataprepper.model.record.Record; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; @@ -33,6 +38,8 @@ @ExtendWith(MockitoExtension.class) public class SubstituteStringProcessorTests { + private static final EventFactory TEST_EVENT_FACTORY = TestEventFactory.getTestEventFactory(); + private final EventKeyFactory eventKeyFactory = TestEventKeyFactory.getTestEventFactory(); @Mock private PluginMetrics pluginMetrics; @@ -42,6 +49,7 @@ public class SubstituteStringProcessorTests { @Mock private ExpressionEvaluator expressionEvaluator; + @BeforeEach public void setup() { lenient().when(config.getIterativeConfig()).thenReturn(Collections.singletonList(createEntry("message", "a", "b", null))); @@ -181,7 +189,8 @@ public boolean equals(Object other) { } private SubstituteStringProcessorConfig.Entry createEntry(final String source, final String from, final String to, final String substituteWhen) { - final SubstituteStringProcessorConfig.Entry entry = new SubstituteStringProcessorConfig.Entry(source, from, to, substituteWhen); + final EventKey sourceKey = eventKeyFactory.createEventKey(source); + final SubstituteStringProcessorConfig.Entry entry = new SubstituteStringProcessorConfig.Entry(sourceKey, from, to, substituteWhen); return entry; } @@ -197,7 +206,7 @@ private Record getEvent(Object message) { } private static Record buildRecordWithEvent(final Map data) { - return new Record<>(JacksonEvent.builder() + return new Record<>(TEST_EVENT_FACTORY.eventBuilder(EventBuilder.class) .withData(data) .withEventType("event") .build()); diff --git a/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/TrimStringProcessorTests.java b/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/TrimStringProcessorTests.java index 06efbbad96..921f6a6094 100644 --- a/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/TrimStringProcessorTests.java +++ b/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/TrimStringProcessorTests.java @@ -5,21 +5,26 @@ package org.opensearch.dataprepper.plugins.processor.mutatestring; -import org.opensearch.dataprepper.metrics.PluginMetrics; -import org.opensearch.dataprepper.model.event.Event; -import org.opensearch.dataprepper.model.event.JacksonEvent; -import org.opensearch.dataprepper.model.record.Record; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import 
org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.dataprepper.event.TestEventFactory; +import org.opensearch.dataprepper.event.TestEventKeyFactory; +import org.opensearch.dataprepper.metrics.PluginMetrics; +import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventBuilder; +import org.opensearch.dataprepper.model.event.EventFactory; +import org.opensearch.dataprepper.model.event.EventKeyFactory; +import org.opensearch.dataprepper.model.record.Record; -import java.util.Arrays; import java.util.Collections; import java.util.HashMap; import java.util.List; import java.util.Map; +import java.util.stream.Collectors; +import java.util.stream.Stream; import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.is; @@ -29,6 +34,8 @@ @ExtendWith(MockitoExtension.class) public class TrimStringProcessorTests { + private static final EventFactory TEST_EVENT_FACTORY = TestEventFactory.getTestEventFactory(); + private final EventKeyFactory eventKeyFactory = TestEventKeyFactory.getTestEventFactory(); @Mock private PluginMetrics pluginMetrics; @@ -37,7 +44,7 @@ public class TrimStringProcessorTests { @BeforeEach public void setup() { - lenient().when(config.getIterativeConfig()).thenReturn(Collections.singletonList("message")); + lenient().when(config.getIterativeConfig()).thenReturn(Stream.of("message").map(eventKeyFactory::createEventKey).collect(Collectors.toList())); } @Test @@ -62,7 +69,7 @@ public void testSpaceInMiddleTrimStringProcessor() { @Test public void testHappyPathMultiTrimStringProcessor() { - when(config.getIterativeConfig()).thenReturn(Arrays.asList("message", "message2")); + when(config.getIterativeConfig()).thenReturn(Stream.of("message", "message2").map(eventKeyFactory::createEventKey).collect(Collectors.toList())); final TrimStringProcessor processor = createObjectUnderTest(); final Record record = getEvent("thisisamessage "); @@ -77,7 +84,7 @@ public void testHappyPathMultiTrimStringProcessor() { @Test public void testHappyPathMultiMixedTrimStringProcessor() { - lenient().when(config.getIterativeConfig()).thenReturn(Arrays.asList("message", "message2")); + lenient().when(config.getIterativeConfig()).thenReturn(Stream.of("message", "message2").map(eventKeyFactory::createEventKey).collect(Collectors.toList())); final TrimStringProcessor processor = createObjectUnderTest(); final Record record = getEvent("thisisamessage "); @@ -147,7 +154,7 @@ private Record getEvent(Object message) { } private static Record buildRecordWithEvent(final Map data) { - return new Record<>(JacksonEvent.builder() + return new Record<>(TEST_EVENT_FACTORY.eventBuilder(EventBuilder.class) .withData(data) .withEventType("event") .build()); diff --git a/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/UppercaseStringProcessorTests.java b/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/UppercaseStringProcessorTests.java index 14af79d202..c4db6a55e5 100644 --- a/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/UppercaseStringProcessorTests.java +++ b/data-prepper-plugins/mutate-string-processors/src/test/java/org/opensearch/dataprepper/plugins/processor/mutatestring/UppercaseStringProcessorTests.java @@ -5,21 +5,26 @@ package org.opensearch.dataprepper.plugins.processor.mutatestring; -import 
org.opensearch.dataprepper.metrics.PluginMetrics; -import org.opensearch.dataprepper.model.event.Event; -import org.opensearch.dataprepper.model.event.JacksonEvent; -import org.opensearch.dataprepper.model.record.Record; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.dataprepper.event.TestEventFactory; +import org.opensearch.dataprepper.event.TestEventKeyFactory; +import org.opensearch.dataprepper.metrics.PluginMetrics; +import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventBuilder; +import org.opensearch.dataprepper.model.event.EventFactory; +import org.opensearch.dataprepper.model.event.EventKeyFactory; +import org.opensearch.dataprepper.model.record.Record; -import java.util.Arrays; import java.util.Collections; import java.util.HashMap; import java.util.List; import java.util.Map; +import java.util.stream.Collectors; +import java.util.stream.Stream; import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.is; @@ -29,6 +34,9 @@ @ExtendWith(MockitoExtension.class) public class UppercaseStringProcessorTests { + private static final EventFactory TEST_EVENT_FACTORY = TestEventFactory.getTestEventFactory(); + private final EventKeyFactory eventKeyFactory = TestEventKeyFactory.getTestEventFactory(); + @Mock private PluginMetrics pluginMetrics; @@ -37,7 +45,7 @@ public class UppercaseStringProcessorTests { @BeforeEach public void setup() { - lenient().when(config.getIterativeConfig()).thenReturn(Collections.singletonList("message")); + lenient().when(config.getIterativeConfig()).thenReturn(Stream.of("message").map(eventKeyFactory::createEventKey).collect(Collectors.toList())); } @Test @@ -52,7 +60,7 @@ public void testHappyPathUppercaseStringProcessor() { @Test public void testHappyPathMultiUppercaseStringProcessor() { - when(config.getIterativeConfig()).thenReturn(Arrays.asList("message", "message2")); + when(config.getIterativeConfig()).thenReturn(Stream.of("message", "message2").map(eventKeyFactory::createEventKey).collect(Collectors.toList())); final UppercaseStringProcessor processor = createObjectUnderTest(); final Record record = getEvent("thisisamessage"); @@ -67,7 +75,7 @@ public void testHappyPathMultiUppercaseStringProcessor() { @Test public void testHappyPathMultiMixedUppercaseStringProcessor() { - lenient().when(config.getIterativeConfig()).thenReturn(Arrays.asList("message", "message2")); + lenient().when(config.getIterativeConfig()).thenReturn(Stream.of("message", "message2").map(eventKeyFactory::createEventKey).collect(Collectors.toList())); final UppercaseStringProcessor processor = createObjectUnderTest(); final Record record = getEvent("thisisamessage"); @@ -137,7 +145,7 @@ private Record getEvent(Object message) { } private static Record buildRecordWithEvent(final Map data) { - return new Record<>(JacksonEvent.builder() + return new Record<>(TEST_EVENT_FACTORY.eventBuilder(EventBuilder.class) .withData(data) .withEventType("event") .build()); diff --git a/data-prepper-plugins/newline-codecs/build.gradle b/data-prepper-plugins/newline-codecs/build.gradle index b504ed30ee..c71e8755ef 100644 --- a/data-prepper-plugins/newline-codecs/build.gradle +++ b/data-prepper-plugins/newline-codecs/build.gradle @@ -5,7 +5,7 @@ plugins { dependencies { implementation project(':data-prepper-api') implementation 
'com.fasterxml.jackson.core:jackson-annotations' - implementation 'org.apache.parquet:parquet-common:1.14.0' + implementation libs.parquet.common testImplementation project(':data-prepper-plugins:common') testImplementation project(':data-prepper-test-event') } diff --git a/data-prepper-plugins/obfuscate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/obfuscation/ObfuscationProcessorConfig.java b/data-prepper-plugins/obfuscate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/obfuscation/ObfuscationProcessorConfig.java index b99753bc9f..e5893476e0 100644 --- a/data-prepper-plugins/obfuscate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/obfuscation/ObfuscationProcessorConfig.java +++ b/data-prepper-plugins/obfuscate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/obfuscation/ObfuscationProcessorConfig.java @@ -6,6 +6,7 @@ package org.opensearch.dataprepper.plugins.processor.obfuscation; import com.fasterxml.jackson.annotation.JsonProperty; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; import jakarta.validation.constraints.NotEmpty; import jakarta.validation.constraints.NotNull; import org.opensearch.dataprepper.expression.ExpressionEvaluator; @@ -17,6 +18,7 @@ public class ObfuscationProcessorConfig { @JsonProperty("source") + @JsonPropertyDescription("The source field to obfuscate.") @NotEmpty @NotNull private String source; @@ -25,18 +27,29 @@ public class ObfuscationProcessorConfig { private List<String> patterns; @JsonProperty("target") + @JsonPropertyDescription("The new field in which to store the obfuscated value. " + + "This leaves the original source field unchanged. " + + "When no `target` is provided, the source field is updated with the obfuscated value.") private String target; @JsonProperty("action") + @JsonPropertyDescription("The obfuscation action. As of Data Prepper 2.3, only the `mask` action is supported.") private PluginModel action; @JsonProperty("obfuscate_when") + @JsonPropertyDescription("Specifies under what condition the Obfuscate processor should perform matching. " + + "Default is no condition.") private String obfuscateWhen; @JsonProperty("tags_on_match_failure") + @JsonPropertyDescription("The tags to add to an event when the obfuscate processor fails to match the pattern.") private List<String> tagsOnMatchFailure; @JsonProperty("single_word_only") + @JsonPropertyDescription("When set to `true`, a word boundary `\\b` is added to the pattern, " + + "which causes obfuscation to be applied only to words that are standalone in the input text. " + + "By default, it is false, meaning obfuscation patterns are applied to all occurrences. " + + "Available in Data Prepper 2.8 and later.") private boolean singleWordOnly = false; public ObfuscationProcessorConfig() { diff --git a/data-prepper-plugins/opensearch/README.md b/data-prepper-plugins/opensearch/README.md index f6f7b1a3ea..628a75cc80 100644 --- a/data-prepper-plugins/opensearch/README.md +++ b/data-prepper-plugins/opensearch/README.md @@ -238,6 +238,8 @@ the flush timeout and instead flush whatever is present at the end of each batch bulk_size: 4 ``` +- `pipeline` (optional): A string representing the ingest pipeline ID used to preprocess documents. Each incoming record is searched for this field, and if it is present, its value is used as the pipeline for the document. Standard Data Prepper JSON pointer syntax is used to retrieve the value, as shown in the sketch below. 
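A minimal sketch of a sink entry using this option might look like the following (the host, index name, and event key are illustrative, not part of this change):

```yaml
sink:
  - opensearch:
      hosts: ["https://localhost:9200"]
      index: my-index
      # Resolve the ingest pipeline name from each event's info/id field (hypothetical key).
      pipeline: "${/info/id}"
```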
If the field has "/" in it then the incoming record is searched in the json sub-objects instead of just in the root of the json object. For example, if the field is specified as `info/id`, then the root of the event is searched for `info` and if it is found, then `id` is searched inside it. The value specified for `id` is used as the pipeline id. This field can also be a Data Prepper expression that is evaluated to determine the `pipeline_id`. For example, setting to `getMetadata(\"some_metadata_key\")` will use the value of the metadata key as the pipeline_id. + - `ism_policy_file` (optional): A String of absolute file path or AWS S3 URI for an ISM (Index State Management) policy JSON file. This policy file is effective only when there is no built-in policy file for the index type. For example, `custom` index type is currently the only one without a built-in policy file, thus it would use the policy file here if it's provided through this parameter. OpenSearch documentation has more about [ISM policies.](https://opensearch.org/docs/latest/im-plugin/ism/policies/) - `s3_aws_region` (optional): A String represents the region of S3 bucket to read `template_file` or `ism_policy_file`, e.g. us-west-2. Only applies to Amazon OpenSearch Service. Defaults to `us-east-1`. diff --git a/data-prepper-plugins/opensearch/build.gradle b/data-prepper-plugins/opensearch/build.gradle index 1d5be32d00..5e7879d8d1 100644 --- a/data-prepper-plugins/opensearch/build.gradle +++ b/data-prepper-plugins/opensearch/build.gradle @@ -32,19 +32,18 @@ dependencies { implementation 'software.amazon.awssdk:s3' implementation 'software.amazon.awssdk:opensearchserverless' implementation libs.commons.lang3 - implementation 'com.github.ben-manes.caffeine:caffeine:3.1.8' + implementation libs.caffeine implementation 'software.amazon.awssdk:apache-client' implementation 'software.amazon.awssdk:netty-nio-client' implementation 'co.elastic.clients:elasticsearch-java:7.17.0' - implementation('org.apache.maven:maven-artifact:3.9.6') { + implementation('org.apache.maven:maven-artifact:3.9.8') { exclude group: 'org.codehaus.plexus' } testImplementation testLibs.junit.vintage testImplementation libs.commons.io - testImplementation 'net.bytebuddy:byte-buddy:1.14.12' - testImplementation 'net.bytebuddy:byte-buddy-agent:1.14.12' + testImplementation 'net.bytebuddy:byte-buddy:1.14.17' + testImplementation 'net.bytebuddy:byte-buddy-agent:1.14.17' testImplementation testLibs.slf4j.simple - testImplementation testLibs.mockito.inline } sourceSets { diff --git a/data-prepper-plugins/opensearch/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/opensearch/OpenSearchSinkIT.java b/data-prepper-plugins/opensearch/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/opensearch/OpenSearchSinkIT.java index 0e0bc87cb4..b17c0ea47c 100644 --- a/data-prepper-plugins/opensearch/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/opensearch/OpenSearchSinkIT.java +++ b/data-prepper-plugins/opensearch/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/opensearch/OpenSearchSinkIT.java @@ -303,7 +303,7 @@ public void testOutputRawSpanDefault(final boolean estimateBulkSizeUsingCompress .add(OpenSearchSink.BULKREQUEST_SIZE_BYTES).toString()); assertThat(bulkRequestSizeBytesMetrics.size(), equalTo(3)); assertThat(bulkRequestSizeBytesMetrics.get(0).getValue(), closeTo(1.0, 0)); - final double expectedBulkRequestSizeBytes = isRequestCompressionEnabled && estimateBulkSizeUsingCompression ? 
773.0 : 2058.0; + final double expectedBulkRequestSizeBytes = isRequestCompressionEnabled && estimateBulkSizeUsingCompression ? 792.0 : 2058.0; assertThat(bulkRequestSizeBytesMetrics.get(1).getValue(), closeTo(expectedBulkRequestSizeBytes, 0)); assertThat(bulkRequestSizeBytesMetrics.get(2).getValue(), closeTo(expectedBulkRequestSizeBytes, 0)); } @@ -364,7 +364,7 @@ public void testOutputRawSpanWithDLQ(final boolean estimateBulkSizeUsingCompress .add(OpenSearchSink.BULKREQUEST_SIZE_BYTES).toString()); assertThat(bulkRequestSizeBytesMetrics.size(), equalTo(3)); assertThat(bulkRequestSizeBytesMetrics.get(0).getValue(), closeTo(1.0, 0)); - final double expectedBulkRequestSizeBytes = isRequestCompressionEnabled && estimateBulkSizeUsingCompression ? 1066.0 : 2072.0; + final double expectedBulkRequestSizeBytes = isRequestCompressionEnabled && estimateBulkSizeUsingCompression ? 1078.0 : 2072.0; assertThat(bulkRequestSizeBytesMetrics.get(1).getValue(), closeTo(expectedBulkRequestSizeBytes, 0)); assertThat(bulkRequestSizeBytesMetrics.get(2).getValue(), closeTo(expectedBulkRequestSizeBytes, 0)); @@ -426,7 +426,7 @@ public void testOutputServiceMapDefault(final boolean estimateBulkSizeUsingCompr .add(OpenSearchSink.BULKREQUEST_SIZE_BYTES).toString()); assertThat(bulkRequestSizeBytesMetrics.size(), equalTo(3)); assertThat(bulkRequestSizeBytesMetrics.get(0).getValue(), closeTo(1.0, 0)); - final double expectedBulkRequestSizeBytes = isRequestCompressionEnabled && estimateBulkSizeUsingCompression ? 366.0 : 265.0; + final double expectedBulkRequestSizeBytes = isRequestCompressionEnabled && estimateBulkSizeUsingCompression ? 376.0 : 265.0; assertThat(bulkRequestSizeBytesMetrics.get(1).getValue(), closeTo(expectedBulkRequestSizeBytes, 0)); assertThat(bulkRequestSizeBytesMetrics.get(2).getValue(), closeTo(expectedBulkRequestSizeBytes, 0)); diff --git a/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/OpenSearchSink.java b/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/OpenSearchSink.java index e1547a925c..199b4e1e0e 100644 --- a/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/OpenSearchSink.java +++ b/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/OpenSearchSink.java @@ -115,6 +115,7 @@ public class OpenSearchSink extends AbstractSink> { private final String documentId; private final String routingField; private final String routing; + private final String pipeline; private final String action; private final List> actions; private final String documentRootKey; @@ -170,6 +171,7 @@ public OpenSearchSink(final PluginSetting pluginSetting, this.documentId = openSearchSinkConfig.getIndexConfiguration().getDocumentId(); this.routingField = openSearchSinkConfig.getIndexConfiguration().getRoutingField(); this.routing = openSearchSinkConfig.getIndexConfiguration().getRouting(); + this.pipeline = openSearchSinkConfig.getIndexConfiguration().getPipeline(); this.action = openSearchSinkConfig.getIndexConfiguration().getAction(); this.actions = openSearchSinkConfig.getIndexConfiguration().getActions(); this.documentRootKey = openSearchSinkConfig.getIndexConfiguration().getDocumentRootKey(); @@ -299,6 +301,7 @@ private BulkOperation getBulkOperationForAction(final String action, BulkOperation bulkOperation; final Optional docId = document.getDocumentId(); final Optional routing = document.getRoutingField(); + final Optional 
pipeline = document.getPipelineField(); if (StringUtils.equals(action, OpenSearchBulkActions.CREATE.toString())) { final CreateOperation.Builder createOperationBuilder = @@ -307,6 +310,8 @@ private BulkOperation getBulkOperationForAction(final String action, .document(document); docId.ifPresent(createOperationBuilder::id); routing.ifPresent(createOperationBuilder::routing); + pipeline.ifPresent(createOperationBuilder::pipeline); + bulkOperation = new BulkOperation.Builder() .create(createOperationBuilder.build()) .build(); @@ -367,6 +372,7 @@ private BulkOperation getBulkOperationForAction(final String action, .versionType(versionType); docId.ifPresent(indexOperationBuilder::id); routing.ifPresent(indexOperationBuilder::routing); + pipeline.ifPresent(indexOperationBuilder::pipeline); bulkOperation = new BulkOperation.Builder() .index(indexOperationBuilder.build()) .build(); @@ -502,9 +508,21 @@ SerializedJson getDocument(final Event event) { } } + String pipelineValue = null; + if (pipeline != null) { + try { + pipelineValue = event.formatString(pipeline, expressionEvaluator); + } catch (final ExpressionEvaluationException | EventKeyNotFoundException e) { + LOG.error("Unable to construct pipeline with format {}", pipeline, e); + } + if (StringUtils.isBlank(pipelineValue)) { + pipelineValue = null; + } + } + final String document = DocumentBuilder.build(event, documentRootKey, sinkContext.getTagsTargetKey(), sinkContext.getIncludeKeys(), sinkContext.getExcludeKeys()); - return SerializedJson.fromStringAndOptionals(document, docId, routingValue); + return SerializedJson.fromStringAndOptionals(document, docId, routingValue, pipelineValue); } private void flushBatch(AccumulatingBulkRequest accumulatingBulkRequest) { diff --git a/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJson.java b/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJson.java index 671a4d0423..d85a5992c5 100644 --- a/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJson.java +++ b/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJson.java @@ -17,6 +17,7 @@ public interface SerializedJson extends SizedDocument { byte[] getSerializedJson(); Optional<String> getDocumentId(); Optional<String> getRoutingField(); + Optional<String> getPipelineField(); /** * Creates a new {@link SerializedJson} from a JSON string and optional documentId and routingField. @@ -26,9 +27,9 @@ public interface SerializedJson extends SizedDocument { * @param routingField Optional routing field string + * @param pipelineField Optional pipeline name string * @return A new {@link SerializedJson}. 
*/ - static SerializedJson fromStringAndOptionals(String jsonString, String docId, String routingField) { + static SerializedJson fromStringAndOptionals(String jsonString, String docId, String routingField, String pipelineField) { Objects.requireNonNull(jsonString); - return new SerializedJsonImpl(jsonString.getBytes(StandardCharsets.UTF_8), docId, routingField); + return new SerializedJsonImpl(jsonString.getBytes(StandardCharsets.UTF_8), docId, routingField, pipelineField); } static SerializedJson fromJsonNode(final JsonNode jsonNode, SerializedJson document) { diff --git a/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonImpl.java b/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonImpl.java index 06a26cba65..2f1fbd2b96 100644 --- a/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonImpl.java +++ b/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonImpl.java @@ -12,11 +12,13 @@ class SerializedJsonImpl implements SerializedJson, Serializable { private byte[] document; private String documentId = null; private String routingField = null; + private String pipelineField = null; - public SerializedJsonImpl(final byte[] document, String docId, String routingField) { + public SerializedJsonImpl(final byte[] document, String docId, String routingField, String pipelineField) { this.document = document; this.documentId = docId; this.routingField = routingField; + this.pipelineField = pipelineField; } public SerializedJsonImpl(final byte[] document) { @@ -42,4 +44,7 @@ public Optional<String> getDocumentId() { public Optional<String> getRoutingField() { return Optional.ofNullable(routingField); } + + @Override + public Optional<String> getPipelineField() { return Optional.ofNullable(pipelineField); } } diff --git a/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonNode.java b/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonNode.java index 7e56839d8c..41d9459347 100644 --- a/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonNode.java +++ b/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonNode.java @@ -14,12 +14,14 @@ class SerializedJsonNode implements SerializedJson, Serializable { private JsonNode jsonNode; private String documentId = null; private String routingField = null; + private String pipelineField = null; public SerializedJsonNode(final JsonNode jsonNode, SerializedJson doc) { this.jsonNode = jsonNode; this.documentId = doc.getDocumentId().orElse(null); this.routingField = doc.getRoutingField().orElse(null); this.document = jsonNode.toString().getBytes(); + this.pipelineField = doc.getPipelineField().orElse(null); } public SerializedJsonNode(final JsonNode jsonNode) { @@ -46,4 +48,7 @@ public Optional<String> getDocumentId() { public Optional<String> getRoutingField() { return Optional.ofNullable(routingField); } + + @Override + public Optional<String> getPipelineField() { return Optional.ofNullable(pipelineField); } } diff --git a/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/index/IndexConfiguration.java 
b/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/index/IndexConfiguration.java index b3bbb213b2..21b178ea9c 100644 --- a/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/index/IndexConfiguration.java +++ b/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/index/IndexConfiguration.java @@ -57,6 +57,7 @@ public class IndexConfiguration { public static final String DOCUMENT_ID = "document_id"; public static final String ROUTING_FIELD = "routing_field"; public static final String ROUTING = "routing"; + public static final String PIPELINE = "pipeline"; public static final String ISM_POLICY_FILE = "ism_policy_file"; public static final long DEFAULT_BULK_SIZE = 5L; public static final boolean DEFAULT_ESTIMATE_BULK_SIZE_USING_COMPRESSION = false; @@ -81,6 +82,7 @@ public class IndexConfiguration { private final Map indexTemplate; private final String documentIdField; private final String documentId; + private final String pipeline; private final String routingField; private final String routing; private final long bulkSize; @@ -147,6 +149,7 @@ private IndexConfiguration(final Builder builder) { this.flushTimeout = builder.flushTimeout; this.routingField = builder.routingField; this.routing = builder.routing; + this.pipeline = builder.pipeline; String documentIdField = builder.documentIdField; String documentId = builder.documentId; @@ -266,6 +269,11 @@ public static IndexConfiguration readIndexConfig(final PluginSetting pluginSetti builder = builder.withRouting(routing); } + final String pipeline = pluginSetting.getStringOrDefault(PIPELINE, null); + if (pipeline != null) { + builder = builder.withPipeline(pipeline); + } + final String ismPolicyFile = pluginSetting.getStringOrDefault(ISM_POLICY_FILE, null); builder = builder.withIsmPolicyFile(ismPolicyFile); @@ -336,6 +344,10 @@ public String getRouting() { return routing; } + public String getPipeline() { + return pipeline; + } + public long getBulkSize() { return bulkSize; } @@ -459,6 +471,7 @@ public static class Builder { private int numReplicas; private String routingField; private String routing; + private String pipeline; private String documentIdField; private String documentId; private long bulkSize = DEFAULT_BULK_SIZE; @@ -534,6 +547,11 @@ public Builder withRouting(final String routing) { return this; } + public Builder withPipeline(final String pipeline) { + this.pipeline = pipeline; + return this; + } + public Builder withBulkSize(final long bulkSize) { this.bulkSize = bulkSize; return this; diff --git a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/BulkRetryStrategyTests.java b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/BulkRetryStrategyTests.java index 09b78a1de8..cc05514502 100644 --- a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/BulkRetryStrategyTests.java +++ b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/BulkRetryStrategyTests.java @@ -717,7 +717,7 @@ private static BulkResponseItem customBulkFailureResponse(final RestStatus restS } private SerializedJson arbitraryDocument() { - return SerializedJson.fromStringAndOptionals("{}", null, null); + return SerializedJson.fromStringAndOptionals("{}", null, null, null); } private static class FakeClient { diff --git 
a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/OpenSearchSinkTest.java b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/OpenSearchSinkTest.java index 9b51954b62..31b77e0bf3 100644 --- a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/OpenSearchSinkTest.java +++ b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/OpenSearchSinkTest.java @@ -146,6 +146,7 @@ void setup() { when(indexConfiguration.getDocumentIdField()).thenReturn(null); when(indexConfiguration.getRoutingField()).thenReturn(null); when(indexConfiguration.getRouting()).thenReturn(null); + when(indexConfiguration.getPipeline()).thenReturn(null); when(indexConfiguration.getActions()).thenReturn(null); when(indexConfiguration.getDocumentRootKey()).thenReturn(null); lenient().when(indexConfiguration.getVersionType()).thenReturn(null); @@ -289,6 +290,22 @@ void test_routing_in_document() throws IOException { assertThat(objectUnderTest2.getDocument(event).getRoutingField(), equalTo(Optional.of(routingValue))); } + @Test + void test_pipeline_in_document() throws IOException { + String pipelineValue = UUID.randomUUID().toString(); + String pipelineKey = UUID.randomUUID().toString(); + final OpenSearchSink objectUnderTest = createObjectUnderTest(); + final Event event = JacksonEvent.builder() + .withEventType("event") + .withData(Collections.singletonMap(pipelineKey, pipelineValue)) + .build(); + assertThat(objectUnderTest.getDocument(event).getPipelineField(), equalTo(Optional.empty())); + + when(indexConfiguration.getPipeline()).thenReturn("${"+pipelineKey+"}"); + final OpenSearchSink objectUnderTest2 = createObjectUnderTest(); + assertThat(objectUnderTest2.getDocument(event).getPipelineField(), equalTo(Optional.of(pipelineValue))); + } + @Test void doOutput_with_invalid_version_expression_result_catches_RuntimeException_and_creates_DLQObject() throws IOException { diff --git a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/Es6BulkApiWrapperTest.java b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/Es6BulkApiWrapperTest.java index 558671f091..2ec15e7348 100644 --- a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/Es6BulkApiWrapperTest.java +++ b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/Es6BulkApiWrapperTest.java @@ -26,6 +26,7 @@ import java.io.IOException; import java.util.List; +import java.util.UUID; import java.util.stream.Stream; import static org.hamcrest.CoreMatchers.equalTo; @@ -122,6 +123,26 @@ void testBulk_when_request_index_missing(final boolean isIndex, final boolean is assertThat(endpoint.requestUrl(bulkRequest), equalTo(expectedURI)); } + @Test + void testBulkWithAdditionParameters() throws IOException { + final String requestIndex = "test-index"; + final String expectedURI = String.format(ES6_URI_PATTERN, "test-index"); + when(openSearchClient._transport()).thenReturn(openSearchTransport); + when(openSearchClient._transportOptions()).thenReturn(transportOptions); + when(bulkRequest.index()).thenReturn(requestIndex); + final String pipeline = UUID.randomUUID().toString(); + when(bulkRequest.pipeline()).thenReturn(pipeline); + objectUnderTest.bulk(bulkRequest); + + ArgumentCaptor bulkRequestArgumentCaptor = 
ArgumentCaptor.forClass(BulkRequest.class); + + verify(openSearchTransport).performRequest( + bulkRequestArgumentCaptor.capture(), jsonEndpointArgumentCaptor.capture(), eq(transportOptions)); + final JsonEndpoint endpoint = jsonEndpointArgumentCaptor.getValue(); + assertThat(endpoint.requestUrl(bulkRequest), equalTo(expectedURI)); + assertThat(bulkRequestArgumentCaptor.getValue().pipeline(), equalTo(pipeline)); + } + private static Stream getTypeFlags() { return Stream.of( Arguments.of(true, false, false, false), diff --git a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonImplTest.java b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonImplTest.java index ff52a332c1..04fea232c8 100644 --- a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonImplTest.java +++ b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonImplTest.java @@ -20,6 +20,7 @@ class SerializedJsonImplTest { private byte[] documentBytes; private String documentId; private String routingField; + private String pipelineField; @BeforeEach void setUp() { @@ -27,12 +28,13 @@ void setUp() { documentSize = random.nextInt(1_000) + 100; documentBytes = new byte[documentSize]; - documentId = RandomStringUtils.randomAlphabetic(10); - routingField = RandomStringUtils.randomAlphabetic(10); + documentId = RandomStringUtils.randomAlphabetic(10); + routingField = RandomStringUtils.randomAlphabetic(10); + pipelineField = RandomStringUtils.randomAlphabetic(10); } private SerializedJsonImpl createObjectUnderTest() { - return new SerializedJsonImpl(documentBytes, documentId, routingField); + return new SerializedJsonImpl(documentBytes, documentId, routingField, pipelineField); } @Test @@ -45,5 +47,6 @@ void getSerializedJson_returns_the_document_byte_array_and_fields() { assertThat(createObjectUnderTest().getSerializedJson(), sameInstance(documentBytes)); assertThat(createObjectUnderTest().getDocumentId().get(), equalTo(documentId)); assertThat(createObjectUnderTest().getRoutingField().get(), equalTo(routingField)); + assertThat(createObjectUnderTest().getPipelineField().get(), equalTo(pipelineField)); } } diff --git a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonNodeTest.java b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonNodeTest.java index 1131a13b5d..0716b83248 100644 --- a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonNodeTest.java +++ b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonNodeTest.java @@ -21,6 +21,7 @@ class SerializedJsonNodeTest { private byte[] documentBytes; private String documentId; private String routingField; + private String pipelineField; private JsonNode jsonNode; private SerializedJson document; private String jsonString; @@ -39,7 +40,8 @@ void setUp() { } documentId = RandomStringUtils.randomAlphabetic(10); routingField = RandomStringUtils.randomAlphabetic(10); - document = SerializedJson.fromStringAndOptionals(jsonString, documentId, routingField); + pipelineField = RandomStringUtils.randomAlphabetic(10); + document = SerializedJson.fromStringAndOptionals(jsonString, 
documentId, routingField, pipelineField); } private SerializedJsonNode createObjectUnderTest() { @@ -56,6 +58,7 @@ void getSerializedJson_returns_the_document_byte_array_and_fields() { assertThat(createObjectUnderTest().getSerializedJson(), equalTo(jsonString.getBytes())); assertThat(createObjectUnderTest().getDocumentId().get(), equalTo(documentId)); assertThat(createObjectUnderTest().getRoutingField().get(), equalTo(routingField)); + assertThat(createObjectUnderTest().getPipelineField().get(), equalTo(pipelineField)); } } diff --git a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonTest.java b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonTest.java index 836a1b1c4d..7735d117c0 100644 --- a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonTest.java +++ b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/SerializedJsonTest.java @@ -18,22 +18,24 @@ class SerializedJsonTest { @Test void fromString_returns_SerializedJsonImpl() { - assertThat(SerializedJson.fromStringAndOptionals("{}", null, null), instanceOf(SerializedJsonImpl.class)); + assertThat(SerializedJson.fromStringAndOptionals("{}", null, null, null), instanceOf(SerializedJsonImpl.class)); } @Test void fromString_throws_if_the_jsonString_is_null() { - assertThrows(NullPointerException.class, () -> SerializedJson.fromStringAndOptionals(null, null, null)); + assertThrows(NullPointerException.class, () -> SerializedJson.fromStringAndOptionals(null, null, null, null)); } @Test void fromString_returns_SerializedJsonImpl_with_correctValues() { String documentId = RandomStringUtils.randomAlphabetic(10); String routingField = RandomStringUtils.randomAlphabetic(10); - SerializedJson serializedJson = SerializedJson.fromStringAndOptionals("{}", documentId, routingField); + String pipelineField = RandomStringUtils.randomAlphabetic(10); + SerializedJson serializedJson = SerializedJson.fromStringAndOptionals("{}", documentId, routingField, pipelineField); assertThat(serializedJson, instanceOf(SerializedJsonImpl.class)); assertThat(serializedJson.getDocumentId().get(), equalTo(documentId)); assertThat(serializedJson.getRoutingField().get(), equalTo(routingField)); + assertThat(serializedJson.getPipelineField().get(), equalTo(pipelineField)); assertThat(serializedJson.getSerializedJson(), equalTo("{}".getBytes())); } @@ -41,6 +43,7 @@ void fromString_returns_SerializedJsonImpl_with_correctValues() { void fromString_returns_SerializedJsonNode_with_correctValues() { String documentId = RandomStringUtils.randomAlphabetic(10); String routingField = RandomStringUtils.randomAlphabetic(10); + String pipelineField = RandomStringUtils.randomAlphabetic(10); final String jsonString = "{\"key\":\"value\"}"; JsonNode jsonNode; ObjectMapper objectMapper = new ObjectMapper(); @@ -49,11 +52,12 @@ void fromString_returns_SerializedJsonNode_with_correctValues() { } catch (Exception e) { jsonNode = null; } - SerializedJson document = SerializedJson.fromStringAndOptionals(jsonString, documentId, routingField); + SerializedJson document = SerializedJson.fromStringAndOptionals(jsonString, documentId, routingField, pipelineField); SerializedJson serializedJson = SerializedJson.fromJsonNode(jsonNode, document); assertThat(serializedJson, instanceOf(SerializedJsonNode.class)); 
assertThat(serializedJson.getDocumentId().get(), equalTo(documentId)); assertThat(serializedJson.getRoutingField().get(), equalTo(routingField)); + assertThat(serializedJson.getPipelineField().get(), equalTo(pipelineField)); assertThat(serializedJson.getSerializedJson(), equalTo(jsonString.getBytes())); } } diff --git a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/dlq/FailedBulkOperationConverterTest.java b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/dlq/FailedBulkOperationConverterTest.java index aedd2d304a..17f0e52079 100644 --- a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/dlq/FailedBulkOperationConverterTest.java +++ b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/dlq/FailedBulkOperationConverterTest.java @@ -160,7 +160,7 @@ private void generateRandomDocument() { final String jsonString = String.format("{\"%s\": \"%s\", \"%s\": \"%s\"}", key1, value1, key2, value2); - document = SerializedJson.fromStringAndOptionals(jsonString, null, null); + document = SerializedJson.fromStringAndOptionals(jsonString, null, null, null); } diff --git a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/index/IndexConfigurationTests.java b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/index/IndexConfigurationTests.java index 99251fb956..e14689e25d 100644 --- a/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/index/IndexConfigurationTests.java +++ b/data-prepper-plugins/opensearch/src/test/java/org/opensearch/dataprepper/plugins/sink/opensearch/index/IndexConfigurationTests.java @@ -52,6 +52,9 @@ import static org.opensearch.dataprepper.plugins.sink.opensearch.index.IndexConfiguration.DISTRIBUTION_VERSION; import static org.opensearch.dataprepper.plugins.sink.opensearch.index.IndexConfiguration.DOCUMENT_ROOT_KEY; import static org.opensearch.dataprepper.plugins.sink.opensearch.index.IndexConfiguration.DOCUMENT_VERSION_EXPRESSION; +import static org.opensearch.dataprepper.plugins.sink.opensearch.index.IndexConfiguration.PIPELINE; +import static org.opensearch.dataprepper.plugins.sink.opensearch.index.IndexConfiguration.ROUTING; +import static org.opensearch.dataprepper.plugins.sink.opensearch.index.IndexConfiguration.ROUTING_FIELD; import static org.opensearch.dataprepper.plugins.sink.opensearch.index.IndexConfiguration.SERVERLESS; import static org.opensearch.dataprepper.plugins.sink.opensearch.index.IndexConfiguration.TEMPLATE_TYPE; import static org.opensearch.dataprepper.plugins.sink.opensearch.index.IndexConstants.RAW_DEFAULT_TEMPLATE_FILE; @@ -476,6 +479,39 @@ public void testReadIndexConfig_emptyDocumentRootKey() { assertThrows(IllegalArgumentException.class, () -> IndexConfiguration.readIndexConfig(pluginSetting)); } + @Test + public void testReadIndexConfig_pipeline() { + final Map metadata = initializeConfigMetaData( + IndexType.CUSTOM.getValue(), "foo", null, null, null, null, null); + final String expectedPipelineValue = UUID.randomUUID().toString(); + metadata.put(PIPELINE, expectedPipelineValue); + final PluginSetting pluginSetting = getPluginSetting(metadata); + final IndexConfiguration indexConfiguration = IndexConfiguration.readIndexConfig(pluginSetting); + assertEquals(expectedPipelineValue, indexConfiguration.getPipeline()); + } + + @Test 
+ public void testReadIndexConfig_routing() { + final Map metadata = initializeConfigMetaData( + IndexType.CUSTOM.getValue(), "foo", null, null, null, null, null); + final String expectedRoutingValue = UUID.randomUUID().toString(); + metadata.put(ROUTING, expectedRoutingValue); + final PluginSetting pluginSetting = getPluginSetting(metadata); + final IndexConfiguration indexConfiguration = IndexConfiguration.readIndexConfig(pluginSetting); + assertEquals(expectedRoutingValue, indexConfiguration.getRouting()); + } + + @Test + public void testReadIndexConfig_routingField() { + final Map metadata = initializeConfigMetaData( + IndexType.CUSTOM.getValue(), "foo", null, null, null, null, null); + final String expectedRoutingFieldValue = UUID.randomUUID().toString(); + metadata.put(ROUTING_FIELD, expectedRoutingFieldValue); + final PluginSetting pluginSetting = getPluginSetting(metadata); + final IndexConfiguration indexConfiguration = IndexConfiguration.readIndexConfig(pluginSetting); + assertEquals(expectedRoutingFieldValue, indexConfiguration.getRoutingField()); + } + @ParameterizedTest @ValueSource(strings = {"${key}", "${getMetadata(\"key\")}"}) public void testReadIndexConfig_withValidDocumentVersionExpression(final String versionExpression) { diff --git a/data-prepper-plugins/opensearch/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/opensearch/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index 78ccc25012..0000000000 --- a/data-prepper-plugins/opensearch/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1,3 +0,0 @@ -# To enable mocking of final classes with vanilla Mockito -# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods -mock-maker-inline \ No newline at end of file diff --git a/data-prepper-plugins/otel-logs-source/build.gradle b/data-prepper-plugins/otel-logs-source/build.gradle index 97901da8c3..822e945ba9 100644 --- a/data-prepper-plugins/otel-logs-source/build.gradle +++ b/data-prepper-plugins/otel-logs-source/build.gradle @@ -31,7 +31,6 @@ dependencies { implementation libs.bouncycastle.bcprov implementation libs.bouncycastle.bcpkix testImplementation 'org.assertj:assertj-core:3.25.3' - testImplementation testLibs.mockito.inline testImplementation libs.commons.io } diff --git a/data-prepper-plugins/otel-logs-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/otel-logs-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index 78ccc25012..0000000000 --- a/data-prepper-plugins/otel-logs-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1,3 +0,0 @@ -# To enable mocking of final classes with vanilla Mockito -# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods -mock-maker-inline \ No newline at end of file diff --git a/data-prepper-plugins/otel-metrics-raw-processor/build.gradle b/data-prepper-plugins/otel-metrics-raw-processor/build.gradle index af20b2e74b..a4316fca16 100644 --- a/data-prepper-plugins/otel-metrics-raw-processor/build.gradle +++ b/data-prepper-plugins/otel-metrics-raw-processor/build.gradle @@ -22,7 +22,6 @@ dependencies { implementation 'com.fasterxml.jackson.dataformat:jackson-dataformat-yaml' implementation libs.guava.core testImplementation 
'org.assertj:assertj-core:3.25.3' - testImplementation testLibs.mockito.inline } jacocoTestCoverageVerification { diff --git a/data-prepper-plugins/otel-metrics-raw-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/otelmetrics/OtelMetricsRawProcessorConfig.java b/data-prepper-plugins/otel-metrics-raw-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/otelmetrics/OtelMetricsRawProcessorConfig.java index 9935cc9218..b71a0d1800 100644 --- a/data-prepper-plugins/otel-metrics-raw-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/otelmetrics/OtelMetricsRawProcessorConfig.java +++ b/data-prepper-plugins/otel-metrics-raw-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/otelmetrics/OtelMetricsRawProcessorConfig.java @@ -6,17 +6,23 @@ package org.opensearch.dataprepper.plugins.processor.otelmetrics; import static org.opensearch.dataprepper.plugins.otel.codec.OTelProtoCodec.DEFAULT_EXPONENTIAL_HISTOGRAM_MAX_ALLOWED_SCALE; + import com.fasterxml.jackson.annotation.JsonProperty; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; public class OtelMetricsRawProcessorConfig { @JsonProperty("flatten_attributes") + @JsonPropertyDescription("Whether or not to flatten the `attributes` field in the JSON data.") boolean flattenAttributesFlag = true; + @JsonPropertyDescription("Whether or not to calculate histogram buckets.") private Boolean calculateHistogramBuckets = true; + @JsonPropertyDescription("Whether or not to calculate exponential histogram buckets.") private Boolean calculateExponentialHistogramBuckets = true; + @JsonPropertyDescription("Maximum allowed scale in exponential histogram calculation.") private Integer exponentialHistogramMaxAllowedScale = DEFAULT_EXPONENTIAL_HISTOGRAM_MAX_ALLOWED_SCALE; public Boolean getCalculateExponentialHistogramBuckets() { diff --git a/data-prepper-plugins/otel-metrics-source/build.gradle b/data-prepper-plugins/otel-metrics-source/build.gradle index 25ea578566..96d250d67d 100644 --- a/data-prepper-plugins/otel-metrics-source/build.gradle +++ b/data-prepper-plugins/otel-metrics-source/build.gradle @@ -31,7 +31,6 @@ dependencies { implementation libs.bouncycastle.bcprov implementation libs.bouncycastle.bcpkix testImplementation 'org.assertj:assertj-core:3.25.3' - testImplementation testLibs.mockito.inline testImplementation libs.commons.io } diff --git a/data-prepper-plugins/otel-metrics-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/otel-metrics-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index 78ccc25012..0000000000 --- a/data-prepper-plugins/otel-metrics-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1,3 +0,0 @@ -# To enable mocking of final classes with vanilla Mockito -# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods -mock-maker-inline \ No newline at end of file diff --git a/data-prepper-plugins/otel-trace-group-processor/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/otel-trace-group-processor/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index 78ccc25012..0000000000 --- a/data-prepper-plugins/otel-trace-group-processor/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1,3 +0,0 @@ -# To enable mocking of final classes 
with vanilla Mockito -# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods -mock-maker-inline \ No newline at end of file diff --git a/data-prepper-plugins/otel-trace-raw-processor/build.gradle b/data-prepper-plugins/otel-trace-raw-processor/build.gradle index 6d9994abbb..2df90630d8 100644 --- a/data-prepper-plugins/otel-trace-raw-processor/build.gradle +++ b/data-prepper-plugins/otel-trace-raw-processor/build.gradle @@ -18,9 +18,8 @@ dependencies { implementation libs.armeria.grpc implementation 'com.fasterxml.jackson.core:jackson-databind' implementation 'com.fasterxml.jackson.dataformat:jackson-dataformat-yaml' - implementation 'com.github.ben-manes.caffeine:caffeine:3.1.8' + implementation libs.caffeine testImplementation 'org.assertj:assertj-core:3.25.3' - testImplementation testLibs.mockito.inline } jacocoTestCoverageVerification { diff --git a/data-prepper-plugins/otel-trace-raw-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/oteltrace/OtelTraceRawProcessorConfig.java b/data-prepper-plugins/otel-trace-raw-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/oteltrace/OtelTraceRawProcessorConfig.java index 553e1ed2d1..6b850f7354 100644 --- a/data-prepper-plugins/otel-trace-raw-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/oteltrace/OtelTraceRawProcessorConfig.java +++ b/data-prepper-plugins/otel-trace-raw-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/oteltrace/OtelTraceRawProcessorConfig.java @@ -6,6 +6,7 @@ package org.opensearch.dataprepper.plugins.processor.oteltrace; import com.fasterxml.jackson.annotation.JsonProperty; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; import java.time.Duration; @@ -14,12 +15,17 @@ public class OtelTraceRawProcessorConfig { static final Duration DEFAULT_TRACE_ID_TTL = Duration.ofSeconds(15L); static final long MAX_TRACE_ID_CACHE_SIZE = 1_000_000L; @JsonProperty("trace_flush_interval") + @JsonPropertyDescription("Represents the time interval in seconds to flush all the descendant spans without any " + + "root span. Default is 180.") private long traceFlushInterval = DEFAULT_TG_FLUSH_INTERVAL_SEC; @JsonProperty("trace_group_cache_ttl") + @JsonPropertyDescription("Represents the time-to-live to cache trace group details. Default is 15 seconds.") private Duration traceGroupCacheTimeToLive = DEFAULT_TRACE_ID_TTL; @JsonProperty("trace_group_cache_max_size") + @JsonPropertyDescription("Represents the maximum size of the cache to store the trace group details from root spans. 
" + + "Default is 1000000.") private long traceGroupCacheMaxSize = MAX_TRACE_ID_CACHE_SIZE; public long getTraceFlushIntervalSeconds() { diff --git a/data-prepper-plugins/otel-trace-source/build.gradle b/data-prepper-plugins/otel-trace-source/build.gradle index 39c0869851..d1dcdfa12a 100644 --- a/data-prepper-plugins/otel-trace-source/build.gradle +++ b/data-prepper-plugins/otel-trace-source/build.gradle @@ -29,7 +29,6 @@ dependencies { implementation libs.bouncycastle.bcprov implementation libs.bouncycastle.bcpkix testImplementation 'org.assertj:assertj-core:3.25.3' - testImplementation testLibs.mockito.inline testImplementation testLibs.slf4j.simple testImplementation libs.commons.io } diff --git a/data-prepper-plugins/otel-trace-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/otel-trace-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index 78ccc25012..0000000000 --- a/data-prepper-plugins/otel-trace-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1,3 +0,0 @@ -# To enable mocking of final classes with vanilla Mockito -# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods -mock-maker-inline \ No newline at end of file diff --git a/data-prepper-plugins/parquet-codecs/build.gradle b/data-prepper-plugins/parquet-codecs/build.gradle index ea783c53d4..c402fb6741 100644 --- a/data-prepper-plugins/parquet-codecs/build.gradle +++ b/data-prepper-plugins/parquet-codecs/build.gradle @@ -7,16 +7,28 @@ dependencies { implementation project(':data-prepper-api') implementation project(':data-prepper-plugins:common') implementation libs.avro.core - implementation libs.hadoop.common - implementation(libs.hadoop.mapreduce) { + implementation 'org.apache.commons:commons-text:1.11.0' + implementation libs.parquet.avro + implementation libs.parquet.column + implementation libs.parquet.common + implementation libs.parquet.hadoop + runtimeOnly(libs.hadoop.common) { + exclude group: 'org.eclipse.jetty' + exclude group: 'org.apache.hadoop', module: 'hadoop-auth' + exclude group: 'org.apache.zookeeper', module: 'zookeeper' + } + runtimeOnly(libs.hadoop.mapreduce) { + exclude group: 'org.eclipse.jetty' exclude group: 'org.apache.hadoop', module: 'hadoop-hdfs-client' + exclude group: 'org.apache.zookeeper', module: 'zookeeper' } - implementation 'org.apache.parquet:parquet-avro:1.14.0' - implementation 'org.apache.parquet:parquet-column:1.14.0' - implementation 'org.apache.parquet:parquet-common:1.14.0' - implementation 'org.apache.parquet:parquet-hadoop:1.14.0' testImplementation project(':data-prepper-test-common') testImplementation project(':data-prepper-test-event') + testImplementation(libs.hadoop.common) { + exclude group: 'org.eclipse.jetty' + exclude group: 'org.apache.hadoop', module: 'hadoop-auth' + exclude group: 'org.apache.zookeeper', module: 'zookeeper' + } constraints { implementation('com.nimbusds:nimbus-jose-jwt') { diff --git a/data-prepper-plugins/parquet-codecs/src/main/java/org/opensearch/dataprepper/plugins/codec/parquet/ParquetInputCodec.java b/data-prepper-plugins/parquet-codecs/src/main/java/org/opensearch/dataprepper/plugins/codec/parquet/ParquetInputCodec.java index fa9876f114..e85e0c9926 100644 --- a/data-prepper-plugins/parquet-codecs/src/main/java/org/opensearch/dataprepper/plugins/codec/parquet/ParquetInputCodec.java +++ 
b/data-prepper-plugins/parquet-codecs/src/main/java/org/opensearch/dataprepper/plugins/codec/parquet/ParquetInputCodec.java @@ -6,8 +6,9 @@ package org.opensearch.dataprepper.plugins.codec.parquet; import org.apache.avro.generic.GenericRecord; -import org.apache.hadoop.conf.Configuration; import org.apache.parquet.avro.AvroParquetReader; +import org.apache.parquet.conf.ParquetConfiguration; +import org.apache.parquet.conf.PlainParquetConfiguration; import org.apache.parquet.hadoop.ParquetReader; import org.opensearch.dataprepper.model.annotations.DataPrepperPlugin; import org.opensearch.dataprepper.model.annotations.DataPrepperPluginConstructor; @@ -46,13 +47,13 @@ public class ParquetInputCodec implements InputCodec { private static final Logger LOG = LoggerFactory.getLogger(ParquetInputCodec.class); - private final Configuration configuration; + private final ParquetConfiguration configuration; private final EventFactory eventFactory; @DataPrepperPluginConstructor public ParquetInputCodec(final EventFactory eventFactory) { this.eventFactory = eventFactory; - configuration = new Configuration(); + configuration = new PlainParquetConfiguration(); configuration.setBoolean(READ_INT96_AS_FIXED, true); } @@ -80,8 +81,7 @@ public void parse(final InputFile inputFile, final DecompressionEngine decompres } private void parseParquetFile(final InputFile inputFile, final Consumer> eventConsumer) throws IOException { - try (ParquetReader reader = AvroParquetReader.builder(inputFile) - .withConf(this.configuration) + try (ParquetReader reader = AvroParquetReader.builder(inputFile, this.configuration) .build()) { GenericRecordJsonEncoder encoder = new GenericRecordJsonEncoder(); GenericRecord record = null; diff --git a/data-prepper-plugins/parquet-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/parquet/ParquetInputCodecTest.java b/data-prepper-plugins/parquet-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/parquet/ParquetInputCodecTest.java index 1510ad75cc..5ae5f82d0d 100644 --- a/data-prepper-plugins/parquet-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/parquet/ParquetInputCodecTest.java +++ b/data-prepper-plugins/parquet-codecs/src/test/java/org/opensearch/dataprepper/plugins/codec/parquet/ParquetInputCodecTest.java @@ -8,8 +8,17 @@ import org.apache.avro.generic.GenericData; import org.apache.avro.generic.GenericRecord; import org.apache.parquet.avro.AvroParquetWriter; +import org.apache.parquet.conf.PlainParquetConfiguration; +import org.apache.parquet.example.data.Group; +import org.apache.parquet.example.data.simple.NanoTime; +import org.apache.parquet.example.data.simple.SimpleGroup; import org.apache.parquet.hadoop.ParquetReader; import org.apache.parquet.hadoop.ParquetWriter; +import org.apache.parquet.hadoop.example.ExampleParquetWriter; +import org.apache.parquet.hadoop.example.GroupWriteSupport; +import org.apache.parquet.schema.MessageType; +import org.apache.parquet.schema.PrimitiveType; +import org.apache.parquet.schema.Type; import org.junit.jupiter.api.BeforeAll; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; @@ -33,12 +42,15 @@ import java.net.URISyntaxException; import java.net.URL; import java.nio.file.Paths; +import java.time.OffsetDateTime; +import java.time.temporal.JulianFields; import java.util.Arrays; import java.util.Collections; import java.util.List; import java.util.Map; import java.util.function.Consumer; +import static org.apache.parquet.avro.AvroWriteSupport.WRITE_FIXED_AS_INT96; import static 
org.hamcrest.CoreMatchers.notNullValue; import static org.hamcrest.CoreMatchers.startsWith; import static org.hamcrest.MatcherAssert.assertThat; @@ -161,6 +173,22 @@ public void parseInputFile_parsesCorrectly() throws IOException { assertRecordsCorrect(actualRecords); } + @Test + public void parseInputStream_parsesCorrectly_with_int96() throws IOException { + final File testDataFile = File.createTempFile(FILE_PREFIX + "-int96-", FILE_SUFFIX); + testDataFile.deleteOnExit(); + generateTestDataInt96(testDataFile); + InputStream targetStream = new FileInputStream(testDataFile); + + parquetInputCodec.parse(targetStream, mockConsumer); + + final ArgumentCaptor<Record<Event>> recordArgumentCaptor = ArgumentCaptor.forClass(Record.class); + verify(mockConsumer, times(10)).accept(recordArgumentCaptor.capture()); + + final List<Record<Event>> actualRecords = recordArgumentCaptor.getAllValues(); + assertThat(actualRecords.size(), equalTo(10)); + } + @Test public void parseInputFile_snappyInputFile() throws IOException, URISyntaxException { URL resource = getClass().getClassLoader().getResource("sample.snappy.parquet"); @@ -203,8 +231,10 @@ public void parseInputFile_testParquetFile() throws IOException, URISyntaxExcept private static void generateTestData(final File file) throws IOException { Schema schema = new Schema.Parser().parse(SCHEMA_JSON); - ParquetWriter<GenericRecord> writer = AvroParquetWriter.<GenericRecord>builder(new LocalOutputFile(file)) + final ParquetWriter<GenericRecord> writer = AvroParquetWriter.<GenericRecord>builder(new LocalOutputFile(file)) .withSchema(schema) + .withConf(new PlainParquetConfiguration()) + .withEncryption(null) .build(); for (int i = 0; i < 10; i++) { @@ -220,6 +250,34 @@ private static void generateTestData(final File file) throws IOException { writer.close(); } + /** + * Generates a Parquet file with INT96 data. This must use the example + * schema rather than Avro; otherwise it would not correctly reproduce the possible INT96 + * error. 
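+ * (An INT96 value packs an 8-byte nanoseconds-of-day with a 4-byte Julian day, a physical type the Avro write path does not produce by default.)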
+ * + * @param file The file for Parquet + */ + private static void generateTestDataInt96(final File file) throws IOException { + final MessageType schema = new MessageType("test", List.of( + new PrimitiveType(Type.Repetition.OPTIONAL, PrimitiveType.PrimitiveTypeName.INT96, "my_timestamp_value") + )); + final PlainParquetConfiguration conf = new PlainParquetConfiguration(); + conf.setStrings(WRITE_FIXED_AS_INT96, "my_timestamp_value"); + conf.set(GroupWriteSupport.PARQUET_EXAMPLE_SCHEMA, schema.toString()); + final ParquetWriter writer = ExampleParquetWriter.builder(new LocalOutputFile(file)) + .withConf(conf) + .withEncryption(null) + .build(); + + for (int i = 0; i < 10; i++) { + final Group group = new SimpleGroup(schema); + group.add("my_timestamp_value", createInt96()); + + writer.write(group); + } + writer.close(); + } + private void assertRecordsCorrect(final List> records) { assertThat(records.size(), equalTo(10)); for (int i = 0; i < 10; i++) { @@ -240,5 +298,9 @@ private void assertRecordsCorrect(final List> records) { assertThat(record.getData().getMetadata().getEventType(), equalTo(EVENT_TYPE)); } } + + private static NanoTime createInt96() { + return new NanoTime((int) OffsetDateTime.now().getLong(JulianFields.JULIAN_DAY), System.nanoTime()); + } } diff --git a/data-prepper-plugins/parquet-codecs/src/main/resources/sample.snappy.parquet b/data-prepper-plugins/parquet-codecs/src/test/resources/sample.snappy.parquet similarity index 100% rename from data-prepper-plugins/parquet-codecs/src/main/resources/sample.snappy.parquet rename to data-prepper-plugins/parquet-codecs/src/test/resources/sample.snappy.parquet diff --git a/data-prepper-plugins/parquet-codecs/src/main/resources/test-parquet.parquet b/data-prepper-plugins/parquet-codecs/src/test/resources/test-parquet.parquet similarity index 100% rename from data-prepper-plugins/parquet-codecs/src/main/resources/test-parquet.parquet rename to data-prepper-plugins/parquet-codecs/src/test/resources/test-parquet.parquet diff --git a/data-prepper-plugins/parse-json-processor/build.gradle b/data-prepper-plugins/parse-json-processor/build.gradle index 44959173ba..488dbf7d86 100644 --- a/data-prepper-plugins/parse-json-processor/build.gradle +++ b/data-prepper-plugins/parse-json-processor/build.gradle @@ -13,7 +13,7 @@ dependencies { implementation 'com.fasterxml.jackson.core:jackson-databind' implementation 'com.fasterxml.jackson.dataformat:jackson-dataformat-ion' implementation 'com.fasterxml.jackson.dataformat:jackson-dataformat-xml' - implementation 'org.apache.parquet:parquet-common:1.14.0' + implementation libs.parquet.common testImplementation project(':data-prepper-test-common') testImplementation project(':data-prepper-test-event') } diff --git a/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/AbstractParseProcessor.java b/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/AbstractParseProcessor.java index a2b984d070..878316c183 100644 --- a/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/AbstractParseProcessor.java +++ b/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/AbstractParseProcessor.java @@ -36,6 +36,7 @@ public abstract class AbstractParseProcessor extends AbstractProcessor tagsOnFailure; private final boolean overwriteIfDestinationExists; + private final boolean deleteSourceRequested; 
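+ // When enabled via the shared parse config, doExecute() removes the source key after a successful parse.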
private final ExpressionEvaluator expressionEvaluator; @@ -50,6 +51,7 @@ protected AbstractParseProcessor(PluginMetrics pluginMetrics, parseWhen = commonParseConfig.getParseWhen(); tagsOnFailure = commonParseConfig.getTagsOnFailure(); overwriteIfDestinationExists = commonParseConfig.getOverwriteIfDestinationExists(); + deleteSourceRequested = commonParseConfig.isDeleteSourceRequested(); this.expressionEvaluator = expressionEvaluator; } @@ -93,6 +95,10 @@ public Collection<Record<Event>> doExecute(final Collection<Record<Event>> recor } else if (overwriteIfDestinationExists || !event.containsKey(destination)) { event.put(destination, parsedValue); } + + if (deleteSourceRequested) { + event.delete(this.source); + } } catch (Exception e) { LOG.error(EVENT, "An exception occurred while using the {} processor on Event [{}]", getProcessorName(), record.getData(), e); } diff --git a/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/CommonParseConfig.java b/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/CommonParseConfig.java index 193631bea9..5fd5050b3d 100644 --- a/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/CommonParseConfig.java +++ b/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/CommonParseConfig.java @@ -27,7 +27,6 @@ public interface CommonParseConfig { * An optional setting used to specify a JSON Pointer. Pointer points to the JSON key that will be parsed into the destination. * There is no pointer by default, meaning that the entirety of source will be parsed. If the target key would overwrite an existing * key in the Event then the absolute path of the target key will be placed into destination - * * Note: (should this be configurable/what about double conflicts?) * @return String representing JSON Pointer */ @@ -54,4 +53,10 @@ public interface CommonParseConfig { * Defaults to true. */ boolean getOverwriteIfDestinationExists(); + + /** + * An optional setting used to request dropping the original raw message after successfully parsing the input event. + * Defaults to false. 
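+ * @return true if the source field should be deleted after parsing; false otherwise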
+ */ + boolean isDeleteSourceRequested(); } diff --git a/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/ion/ParseIonProcessorConfig.java b/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/ion/ParseIonProcessorConfig.java index 67a2f464ad..fcc2950477 100644 --- a/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/ion/ParseIonProcessorConfig.java +++ b/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/ion/ParseIonProcessorConfig.java @@ -35,6 +35,9 @@ public class ParseIonProcessorConfig implements CommonParseConfig { @JsonProperty("overwrite_if_destination_exists") private boolean overwriteIfDestinationExists = true; + @JsonProperty + private boolean deleteSource = false; + @Override public String getSource() { return source; @@ -68,6 +71,11 @@ boolean isValidDestination() { if (Objects.isNull(destination)) return true; final String trimmedDestination = destination.trim(); - return trimmedDestination.length() != 0 && !(trimmedDestination.equals("/")); + return !trimmedDestination.isEmpty() && !(trimmedDestination.equals("/")); + } + + @Override + public boolean isDeleteSourceRequested() { + return deleteSource; } } diff --git a/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/json/ParseJsonProcessorConfig.java b/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/json/ParseJsonProcessorConfig.java index e0a2e91c1d..49ff2a5969 100644 --- a/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/json/ParseJsonProcessorConfig.java +++ b/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/json/ParseJsonProcessorConfig.java @@ -35,6 +35,9 @@ public class ParseJsonProcessorConfig implements CommonParseConfig { @JsonProperty("overwrite_if_destination_exists") private boolean overwriteIfDestinationExists = true; + @JsonProperty + private boolean deleteSource = false; + @Override public String getSource() { return source; @@ -63,11 +66,16 @@ public boolean getOverwriteIfDestinationExists() { return overwriteIfDestinationExists; } + @Override + public boolean isDeleteSourceRequested() { + return deleteSource; + } + @AssertTrue(message = "destination cannot be empty, whitespace, or a front slash (/)") boolean isValidDestination() { if (Objects.isNull(destination)) return true; final String trimmedDestination = destination.trim(); - return trimmedDestination.length() != 0 && !(trimmedDestination.equals("/")); + return !trimmedDestination.isEmpty() && !(trimmedDestination.equals("/")); } } diff --git a/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/xml/ParseXmlProcessorConfig.java b/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/xml/ParseXmlProcessorConfig.java index df4fabc397..c90173dc43 100644 --- a/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/xml/ParseXmlProcessorConfig.java +++ b/data-prepper-plugins/parse-json-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/parse/xml/ParseXmlProcessorConfig.java @@ -30,6 +30,9 @@ public class ParseXmlProcessorConfig 
implements CommonParseConfig { @JsonProperty("overwrite_if_destination_exists") private boolean overwriteIfDestinationExists = true; + @JsonProperty + private boolean deleteSource = false; + @Override public String getSource() { return source; @@ -65,6 +68,11 @@ boolean isValidDestination() { if (Objects.isNull(destination)) return true; final String trimmedDestination = destination.trim(); - return trimmedDestination.length() != 0 && !(trimmedDestination.equals("/")); + return !trimmedDestination.isEmpty() && !(trimmedDestination.equals("/")); + } + + @Override + public boolean isDeleteSourceRequested() { + return deleteSource; } } diff --git a/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/ion/ParseIonProcessorConfigTest.java b/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/ion/ParseIonProcessorConfigTest.java index 0fb274ba13..8c47650c05 100644 --- a/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/ion/ParseIonProcessorConfigTest.java +++ b/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/ion/ParseIonProcessorConfigTest.java @@ -57,6 +57,9 @@ void test_when_destinationIsWhiteSpaceOrFrontSlash_then_isValidDestinationFalse( setField(ParseIonProcessorConfig.class, config, "tagsOnFailure", tagsList); assertThat(config.getTagsOnFailure(), equalTo(tagsList)); + + setField(ParseIonProcessorConfig.class, config, "deleteSource", true); + assertThat(config.isDeleteSourceRequested(), equalTo(true)); } } } diff --git a/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/ion/ParseIonProcessorTest.java b/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/ion/ParseIonProcessorTest.java index 62873866d7..c9a8fdf4e5 100644 --- a/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/ion/ParseIonProcessorTest.java +++ b/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/ion/ParseIonProcessorTest.java @@ -47,6 +47,23 @@ void test_when_using_ion_features_then_processorParsesCorrectly() { final String serializedMessage = "{bareKey: 1, symbol: SYMBOL, timestamp: 2023-11-30T21:05:23.383Z, attribute: dollars::100.0 }"; final Event parsedEvent = createAndParseMessageEvent(serializedMessage); + assertThat(parsedEvent.containsKey(processorConfig.getSource()), equalTo(true)); + assertThat(parsedEvent.get(processorConfig.getSource(), Object.class), equalTo(serializedMessage)); + assertThat(parsedEvent.get("bareKey", Integer.class), equalTo(1)); + assertThat(parsedEvent.get("symbol", String.class), equalTo("SYMBOL")); + assertThat(parsedEvent.get("timestamp", String.class), equalTo("2023-11-30T21:05:23.383Z")); + assertThat(parsedEvent.get("attribute", Double.class), equalTo(100.0)); + } + + @Test + void test_when_deleteSourceFlagEnabled() { + when(processorConfig.isDeleteSourceRequested()).thenReturn(true); + parseJsonProcessor = new ParseIonProcessor(pluginMetrics, ionProcessorConfig, expressionEvaluator); + + final String serializedMessage = "{bareKey: 1, symbol: SYMBOL, timestamp: 2023-11-30T21:05:23.383Z, attribute: dollars::100.0 }"; + final Event parsedEvent = createAndParseMessageEvent(serializedMessage); + + 
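+ // With delete_source enabled, the original source key should be gone while every parsed key remains: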
assertThat(parsedEvent.containsKey(processorConfig.getSource()), equalTo(false)); assertThat(parsedEvent.get("bareKey", Integer.class), equalTo(1)); assertThat(parsedEvent.get("symbol", String.class), equalTo("SYMBOL")); assertThat(parsedEvent.get("timestamp", String.class), equalTo("2023-11-30T21:05:23.383Z")); diff --git a/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/json/ParseJsonProcessorConfigTest.java b/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/json/ParseJsonProcessorConfigTest.java index 459fab6ea5..aa138a0e7e 100644 --- a/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/json/ParseJsonProcessorConfigTest.java +++ b/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/json/ParseJsonProcessorConfigTest.java @@ -29,6 +29,7 @@ public void test_when_defaultParseJsonProcessorConfig_then_returns_default_value assertThat(objectUnderTest.getPointer(), equalTo(null)); assertThat(objectUnderTest.getTagsOnFailure(), equalTo(null)); assertThat(objectUnderTest.getOverwriteIfDestinationExists(), equalTo(true)); + assertThat(objectUnderTest.isDeleteSourceRequested(), equalTo(false)); } @Nested @@ -57,6 +58,9 @@ void test_when_destinationIsWhiteSpaceOrFrontSlash_then_isValidDestinationFalse( setField(ParseJsonProcessorConfig.class, config, "tagsOnFailure", tagsList); assertThat(config.getTagsOnFailure(), equalTo(tagsList)); + + setField(ParseJsonProcessorConfig.class, config, "deleteSource", true); + assertThat(config.isDeleteSourceRequested(), equalTo(true)); } } } diff --git a/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/json/ParseJsonProcessorTest.java b/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/json/ParseJsonProcessorTest.java index 4594cbe2f5..1416d6cf35 100644 --- a/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/json/ParseJsonProcessorTest.java +++ b/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/json/ParseJsonProcessorTest.java @@ -194,6 +194,22 @@ void test_when_nestedJSONArray_then_parsedIntoArrayAndIndicesAccessible() { assertThat(parsedEvent.get(pointerToFirstElement, String.class), equalTo(value.get(0))); } + @Test + void test_when_deleteSourceFlagEnabled() { + when(processorConfig.isDeleteSourceRequested()).thenReturn(true); + parseJsonProcessor = new ParseJsonProcessor(pluginMetrics, jsonProcessorConfig, expressionEvaluator); + + final String key = "key"; + final ArrayList value = new ArrayList<>(List.of("Element0","Element1","Element2")); + final String jsonArray = "{\"key\":[\"Element0\",\"Element1\",\"Element2\"]}"; + final Event parsedEvent = createAndParseMessageEvent(jsonArray); + + assertThat(parsedEvent.containsKey(processorConfig.getSource()), equalTo(false)); + assertThat(parsedEvent.get(key, ArrayList.class), equalTo(value)); + final String pointerToFirstElement = key + "/0"; + assertThat(parsedEvent.get(pointerToFirstElement, String.class), equalTo(value.get(0))); + } + @Test void test_when_nestedJSONArrayOfJSON_then_parsedIntoArrayAndIndicesAccessible() { parseJsonProcessor = createObjectUnderTest(); @@ -373,23 +389,21 @@ private String constructDeeplyNestedJsonPointer(final int numberOfLayers) { 
/** * Naive serialization that converts every = to : and wraps every word with double quotes (no error handling or input validation). - * @param messageMap - * @return + * @param messageMap source key value map + * @return serialized string representation of the map */ private String convertMapToJSONString(final Map<String, Object> messageMap) { final String replaceEquals = messageMap.toString().replace("=",":"); - final String addQuotes = replaceEquals.replaceAll("(\\w+)", "\"$1\""); // wrap every word in quotes - return addQuotes; + return replaceEquals.replaceAll("(\\w+)", "\"$1\""); } /** * Creates a Map that maps a single key to a value nested numberOfLayers layers deep. - * @param numberOfLayers - * @return + * @param numberOfLayers indicates the number of nested layers + * @return a Map representing the nested structure */ private Map<String, Object> constructArbitrarilyDeepJsonMap(final int numberOfLayers) { - final Map<String, Object> result = Collections.singletonMap(DEEPLY_NESTED_KEY_NAME,deepJsonMapHelper(0,numberOfLayers)); - return result; + return Collections.singletonMap(DEEPLY_NESTED_KEY_NAME,deepJsonMapHelper(0,numberOfLayers)); } private Object deepJsonMapHelper(final int currentLayer, final int numberOfLayers) { diff --git a/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/xml/ParseXmlProcessorConfigTest.java b/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/xml/ParseXmlProcessorConfigTest.java index d5e7e1ec43..bab6d6e919 100644 --- a/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/xml/ParseXmlProcessorConfigTest.java +++ b/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/xml/ParseXmlProcessorConfigTest.java @@ -52,6 +52,9 @@ void test_when_destinationIsWhiteSpaceOrFrontSlash_then_isValidDestinationFalse( setField(ParseXmlProcessorConfig.class, config, "tagsOnFailure", tagsList); assertThat(config.getTagsOnFailure(), equalTo(tagsList)); + + setField(ParseXmlProcessorConfig.class, config, "deleteSource", true); + assertThat(config.isDeleteSourceRequested(), equalTo(true)); } } } diff --git a/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/xml/ParseXmlProcessorTest.java b/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/xml/ParseXmlProcessorTest.java index 51de35ca70..8d9bc4cde3 100644 --- a/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/xml/ParseXmlProcessorTest.java +++ b/data-prepper-plugins/parse-json-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/parse/xml/ParseXmlProcessorTest.java @@ -60,6 +60,22 @@ void test_when_using_xml_features_then_processorParsesCorrectly() { assertThat(parsedEvent.get("age", String.class), equalTo("30")); } + @Test + void test_when_deleteSourceFlagEnabled() { + + final String tagOnFailure = UUID.randomUUID().toString(); + when(processorConfig.getTagsOnFailure()).thenReturn(List.of(tagOnFailure)); + when(processorConfig.isDeleteSourceRequested()).thenReturn(true); + + parseXmlProcessor = createObjectUnderTest(); + + final String serializedMessage = "<Person><name>John Doe</name><age>30</age></Person>"; + final Event parsedEvent = createAndParseMessageEvent(serializedMessage); + assertThat(parsedEvent.containsKey(processorConfig.getSource()), equalTo(false)); + assertThat(parsedEvent.get("name", 
String.class), equalTo("John Doe")); + assertThat(parsedEvent.get("age", String.class), equalTo("30")); + } + @Test void test_when_using_invalid_xml_tags_correctly() { diff --git a/data-prepper-plugins/prometheus-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/prometheus-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index 23c33feb6d..0000000000 --- a/data-prepper-plugins/prometheus-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1,3 +0,0 @@ -# To enable mocking of final classes with vanilla Mockito -# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods -mock-maker-inline diff --git a/data-prepper-plugins/rds-source/build.gradle b/data-prepper-plugins/rds-source/build.gradle index 8372276564..f83b1332eb 100644 --- a/data-prepper-plugins/rds-source/build.gradle +++ b/data-prepper-plugins/rds-source/build.gradle @@ -8,6 +8,7 @@ dependencies { implementation project(path: ':data-prepper-plugins:buffer-common') implementation project(path: ':data-prepper-plugins:http-common') implementation project(path: ':data-prepper-plugins:common') + implementation project(path: ':data-prepper-plugins:parquet-codecs') implementation 'io.micrometer:micrometer-core' @@ -20,7 +21,7 @@ dependencies { implementation 'com.fasterxml.jackson.core:jackson-core' implementation 'com.fasterxml.jackson.core:jackson-databind' - testImplementation testLibs.mockito.inline testImplementation project(path: ':data-prepper-test-common') testImplementation 'com.fasterxml.jackson.dataformat:jackson-dataformat-yaml' + testImplementation project(path: ':data-prepper-test-event') } diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/ClientFactory.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/ClientFactory.java index 9cdb2bfa50..7831754f0f 100644 --- a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/ClientFactory.java +++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/ClientFactory.java @@ -10,6 +10,7 @@ import org.opensearch.dataprepper.plugins.source.rds.configuration.AwsAuthenticationConfig; import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider; import software.amazon.awssdk.services.rds.RdsClient; +import software.amazon.awssdk.services.s3.S3Client; public class ClientFactory { private final AwsCredentialsProvider awsCredentialsProvider; @@ -32,4 +33,11 @@ public RdsClient buildRdsClient() { .credentialsProvider(awsCredentialsProvider) .build(); } + + public S3Client buildS3Client() { + return S3Client.builder() + .region(awsAuthenticationConfig.getAwsRegion()) + .credentialsProvider(awsCredentialsProvider) + .build(); + } } diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/RdsService.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/RdsService.java index 0e8a92e31d..77956e6b0e 100644 --- a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/RdsService.java +++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/RdsService.java @@ -8,13 +8,16 @@ import org.opensearch.dataprepper.metrics.PluginMetrics; import 
org.opensearch.dataprepper.model.buffer.Buffer; import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventFactory; import org.opensearch.dataprepper.model.record.Record; import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourceCoordinator; +import org.opensearch.dataprepper.plugins.source.rds.export.DataFileScheduler; import org.opensearch.dataprepper.plugins.source.rds.export.ExportScheduler; import org.opensearch.dataprepper.plugins.source.rds.leader.LeaderScheduler; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import software.amazon.awssdk.services.rds.RdsClient; +import software.amazon.awssdk.services.s3.S3Client; import java.util.ArrayList; import java.util.List; @@ -24,21 +27,34 @@ public class RdsService { private static final Logger LOG = LoggerFactory.getLogger(RdsService.class); + /** + * Maximum concurrent data loader per node + */ + public static final int DATA_LOADER_MAX_JOB_COUNT = 1; + private final RdsClient rdsClient; + private final S3Client s3Client; private final EnhancedSourceCoordinator sourceCoordinator; + private final EventFactory eventFactory; private final PluginMetrics pluginMetrics; private final RdsSourceConfig sourceConfig; private ExecutorService executor; + private LeaderScheduler leaderScheduler; + private ExportScheduler exportScheduler; + private DataFileScheduler dataFileScheduler; public RdsService(final EnhancedSourceCoordinator sourceCoordinator, final RdsSourceConfig sourceConfig, + final EventFactory eventFactory, final ClientFactory clientFactory, final PluginMetrics pluginMetrics) { this.sourceCoordinator = sourceCoordinator; + this.eventFactory = eventFactory; this.pluginMetrics = pluginMetrics; this.sourceConfig = sourceConfig; rdsClient = clientFactory.buildRdsClient(); + s3Client = clientFactory.buildS3Client(); } /** @@ -51,8 +67,16 @@ public RdsService(final EnhancedSourceCoordinator sourceCoordinator, public void start(Buffer> buffer) { LOG.info("Start running RDS service"); final List runnableList = new ArrayList<>(); - runnableList.add(new LeaderScheduler(sourceCoordinator, sourceConfig)); - runnableList.add(new ExportScheduler(sourceCoordinator, rdsClient, pluginMetrics)); + leaderScheduler = new LeaderScheduler(sourceCoordinator, sourceConfig); + runnableList.add(leaderScheduler); + + if (sourceConfig.isExportEnabled()) { + exportScheduler = new ExportScheduler(sourceCoordinator, rdsClient, s3Client, pluginMetrics); + dataFileScheduler = new DataFileScheduler( + sourceCoordinator, sourceConfig, s3Client, eventFactory, buffer); + runnableList.add(exportScheduler); + runnableList.add(dataFileScheduler); + } executor = Executors.newFixedThreadPool(runnableList.size()); runnableList.forEach(executor::submit); @@ -65,6 +89,11 @@ public void start(Buffer> buffer) { public void shutdown() { if (executor != null) { LOG.info("shutdown RDS schedulers"); + if (sourceConfig.isExportEnabled()) { + exportScheduler.shutdown(); + dataFileScheduler.shutdown(); + } + leaderScheduler.shutdown(); executor.shutdownNow(); } } diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/RdsSource.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/RdsSource.java index cc4bd23ca0..43806c0475 100644 --- a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/RdsSource.java +++ 
b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/RdsSource.java @@ -11,12 +11,15 @@ import org.opensearch.dataprepper.model.annotations.DataPrepperPluginConstructor; import org.opensearch.dataprepper.model.buffer.Buffer; import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventFactory; import org.opensearch.dataprepper.model.record.Record; import org.opensearch.dataprepper.model.source.Source; import org.opensearch.dataprepper.model.source.coordinator.SourcePartitionStoreItem; import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourceCoordinator; import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourcePartition; import org.opensearch.dataprepper.model.source.coordinator.enhanced.UsesEnhancedSourceCoordination; +import org.opensearch.dataprepper.plugins.source.rds.coordination.PartitionFactory; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.LeaderPartition; import org.slf4j.Logger; import org.slf4j.LoggerFactory; @@ -31,15 +34,18 @@ public class RdsSource implements Source>, UsesEnhancedSourceCoord private final ClientFactory clientFactory; private final PluginMetrics pluginMetrics; private final RdsSourceConfig sourceConfig; + private final EventFactory eventFactory; private EnhancedSourceCoordinator sourceCoordinator; private RdsService rdsService; @DataPrepperPluginConstructor public RdsSource(final PluginMetrics pluginMetrics, final RdsSourceConfig sourceConfig, + final EventFactory eventFactory, final AwsCredentialsSupplier awsCredentialsSupplier) { this.pluginMetrics = pluginMetrics; this.sourceConfig = sourceConfig; + this.eventFactory = eventFactory; clientFactory = new ClientFactory(awsCredentialsSupplier, sourceConfig.getAwsAuthenticationConfig()); } @@ -47,8 +53,9 @@ public RdsSource(final PluginMetrics pluginMetrics, @Override public void start(Buffer> buffer) { Objects.requireNonNull(sourceCoordinator); + sourceCoordinator.createPartition(new LeaderPartition()); - rdsService = new RdsService(sourceCoordinator, sourceConfig, clientFactory, pluginMetrics); + rdsService = new RdsService(sourceCoordinator, sourceConfig, eventFactory, clientFactory, pluginMetrics); LOG.info("Start RDS service"); rdsService.start(buffer); @@ -70,6 +77,6 @@ public void setEnhancedSourceCoordinator(EnhancedSourceCoordinator sourceCoordin @Override public Function getPartitionFactory() { - return null; + return new PartitionFactory(); } } diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/converter/ExportRecordConverter.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/converter/ExportRecordConverter.java new file mode 100644 index 0000000000..11932cd512 --- /dev/null +++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/converter/ExportRecordConverter.java @@ -0,0 +1,36 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.rds.converter; + +import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventMetadata; +import org.opensearch.dataprepper.model.record.Record; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import static org.opensearch.dataprepper.plugins.source.rds.converter.MetadataKeyAttributes.EVENT_TABLE_NAME_METADATA_ATTRIBUTE; 
+import static org.opensearch.dataprepper.plugins.source.rds.converter.MetadataKeyAttributes.INGESTION_EVENT_TYPE_ATTRIBUTE; +import static org.opensearch.dataprepper.plugins.source.rds.converter.MetadataKeyAttributes.PRIMARY_KEY_DOCUMENT_ID_METADATA_ATTRIBUTE; + +public class ExportRecordConverter { + + private static final Logger LOG = LoggerFactory.getLogger(ExportRecordConverter.class); + + static final String EXPORT_EVENT_TYPE = "EXPORT"; + + public Event convert(Record record, String tableName, String primaryKeyName) { + Event event = record.getData(); + + EventMetadata eventMetadata = event.getMetadata(); + eventMetadata.setAttribute(EVENT_TABLE_NAME_METADATA_ATTRIBUTE, tableName); + eventMetadata.setAttribute(INGESTION_EVENT_TYPE_ATTRIBUTE, EXPORT_EVENT_TYPE); + + final Object primaryKeyValue = record.getData().get(primaryKeyName, Object.class); + eventMetadata.setAttribute(PRIMARY_KEY_DOCUMENT_ID_METADATA_ATTRIBUTE, primaryKeyValue); + + return event; + } +} diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/converter/MetadataKeyAttributes.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/converter/MetadataKeyAttributes.java new file mode 100644 index 0000000000..91eecdf07b --- /dev/null +++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/converter/MetadataKeyAttributes.java @@ -0,0 +1,20 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.rds.converter; + +public class MetadataKeyAttributes { + static final String PRIMARY_KEY_DOCUMENT_ID_METADATA_ATTRIBUTE = "primary_key"; + + static final String EVENT_VERSION_FROM_TIMESTAMP = "document_version"; + + static final String EVENT_TIMESTAMP_METADATA_ATTRIBUTE = "event_timestamp"; + + static final String EVENT_NAME_BULK_ACTION_METADATA_ATTRIBUTE = "opensearch_action"; + + static final String EVENT_TABLE_NAME_METADATA_ATTRIBUTE = "table_name"; + + static final String INGESTION_EVENT_TYPE_ATTRIBUTE = "ingestion_type"; +} diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/PartitionFactory.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/PartitionFactory.java new file mode 100644 index 0000000000..6213263b09 --- /dev/null +++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/PartitionFactory.java @@ -0,0 +1,38 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.rds.coordination; + +import org.opensearch.dataprepper.model.source.coordinator.SourcePartitionStoreItem; +import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourcePartition; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.DataFilePartition; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.ExportPartition; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.GlobalState; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.LeaderPartition; + +import java.util.function.Function; + +/** + * Partition factory to map a {@link SourcePartitionStoreItem} to a {@link EnhancedSourcePartition}. 
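+ * The partition type is taken from the suffix of the source identifier, e.g. an identifier ending in "|EXPORT" maps to an {@link ExportPartition}.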
+ */ +public class PartitionFactory implements Function<SourcePartitionStoreItem, EnhancedSourcePartition> { + + @Override + public EnhancedSourcePartition apply(SourcePartitionStoreItem partitionStoreItem) { + String sourceIdentifier = partitionStoreItem.getSourceIdentifier(); + String partitionType = sourceIdentifier.substring(sourceIdentifier.lastIndexOf('|') + 1); + + if (LeaderPartition.PARTITION_TYPE.equals(partitionType)) { + return new LeaderPartition(partitionStoreItem); + } else if (ExportPartition.PARTITION_TYPE.equals(partitionType)) { + return new ExportPartition(partitionStoreItem); + } else if (DataFilePartition.PARTITION_TYPE.equals(partitionType)) { + return new DataFilePartition(partitionStoreItem); + } else { + // Unable to acquire other partitions. + return new GlobalState(partitionStoreItem); + } + } +} diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/DataFilePartition.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/DataFilePartition.java new file mode 100644 index 0000000000..985f48b652 --- /dev/null +++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/DataFilePartition.java @@ -0,0 +1,77 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.rds.coordination.partition; + +import org.opensearch.dataprepper.model.source.coordinator.SourcePartitionStoreItem; +import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourcePartition; +import org.opensearch.dataprepper.plugins.source.rds.coordination.state.DataFileProgressState; + +import java.util.Optional; + +/** + * A DataFilePartition represents an export data file that needs to be loaded. 
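+ * The partition key takes the form {@code exportTaskId|bucket|key}.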
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/DataFilePartition.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/DataFilePartition.java
new file mode 100644
index 0000000000..985f48b652
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/DataFilePartition.java
@@ -0,0 +1,77 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.coordination.partition;
+
+import org.opensearch.dataprepper.model.source.coordinator.SourcePartitionStoreItem;
+import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourcePartition;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.state.DataFileProgressState;
+
+import java.util.Optional;
+
+/**
+ * A DataFilePartition represents an export data file that needs to be loaded.
+ * The source identifier contains the keyword 'DATAFILE'.
+ */
+public class DataFilePartition extends EnhancedSourcePartition<DataFileProgressState> {
+
+    public static final String PARTITION_TYPE = "DATAFILE";
+
+    private final String exportTaskId;
+    private final String bucket;
+    private final String key;
+    private final DataFileProgressState state;
+
+    public DataFilePartition(final SourcePartitionStoreItem sourcePartitionStoreItem) {
+
+        setSourcePartitionStoreItem(sourcePartitionStoreItem);
+        String[] keySplits = sourcePartitionStoreItem.getSourcePartitionKey().split("\\|");
+        exportTaskId = keySplits[0];
+        bucket = keySplits[1];
+        key = keySplits[2];
+        state = convertStringToPartitionProgressState(DataFileProgressState.class, sourcePartitionStoreItem.getPartitionProgressState());
+
+    }
+
+    public DataFilePartition(final String exportTaskId,
+                             final String bucket,
+                             final String key,
+                             final Optional<DataFileProgressState> state) {
+        this.exportTaskId = exportTaskId;
+        this.bucket = bucket;
+        this.key = key;
+        this.state = state.orElse(null);
+    }
+
+    @Override
+    public String getPartitionType() {
+        return PARTITION_TYPE;
+    }
+
+    @Override
+    public String getPartitionKey() {
+        return exportTaskId + "|" + bucket + "|" + key;
+    }
+
+    @Override
+    public Optional<DataFileProgressState> getProgressState() {
+        if (state != null) {
+            return Optional.of(state);
+        }
+        return Optional.empty();
+    }
+
+    public String getExportTaskId() {
+        return exportTaskId;
+    }
+
+    public String getBucket() {
+        return bucket;
+    }
+
+    public String getKey() {
+        return key;
+    }
+}
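The partition key joins the three identifiers with `|`, which is exactly how the store-item constructor above splits them back apart. With hypothetical values:

    // new DataFilePartition("task-1", "my-bucket", "prefix/task-1/db/table/1/part-0.parquet", Optional.empty())
    // getPartitionKey() returns "task-1|my-bucket|prefix/task-1/db/table/1/part-0.parquet"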
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/ExportPartition.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/ExportPartition.java
new file mode 100644
index 0000000000..5d79378dec
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/ExportPartition.java
@@ -0,0 +1,68 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.coordination.partition;
+
+import org.opensearch.dataprepper.model.source.coordinator.SourcePartitionStoreItem;
+import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourcePartition;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.state.ExportProgressState;
+
+import java.util.Optional;
+
+/**
+ * An ExportPartition represents an export job that needs to be run for tables.
+ * Each export job has an export time associated with it.
+ * Each job independently maintains state such as total files/records.
+ * The source identifier contains the keyword 'EXPORT'.
+ */
+public class ExportPartition extends EnhancedSourcePartition<ExportProgressState> {
+    public static final String PARTITION_TYPE = "EXPORT";
+
+    private static final String DB_CLUSTER = "cluster";
+    private static final String DB_INSTANCE = "instance";
+
+    private final String dbIdentifier;
+
+    private final boolean isCluster;
+
+    private final ExportProgressState progressState;
+
+    public ExportPartition(String dbIdentifier, boolean isCluster, ExportProgressState progressState) {
+        this.dbIdentifier = dbIdentifier;
+        this.isCluster = isCluster;
+        this.progressState = progressState;
+    }
+
+    public ExportPartition(SourcePartitionStoreItem sourcePartitionStoreItem) {
+        setSourcePartitionStoreItem(sourcePartitionStoreItem);
+        String[] keySplits = sourcePartitionStoreItem.getSourcePartitionKey().split("\\|");
+        dbIdentifier = keySplits[0];
+        isCluster = DB_CLUSTER.equals(keySplits[1]);
+        progressState = convertStringToPartitionProgressState(ExportProgressState.class, sourcePartitionStoreItem.getPartitionProgressState());
+    }
+
+    @Override
+    public String getPartitionType() {
+        return PARTITION_TYPE;
+    }
+
+    @Override
+    public String getPartitionKey() {
+        final String dbType = isCluster ? DB_CLUSTER : DB_INSTANCE;
+        return dbIdentifier + "|" + dbType;
+    }
+
+    @Override
+    public Optional<ExportProgressState> getProgressState() {
+        if (progressState != null) {
+            return Optional.of(progressState);
+        }
+        return Optional.empty();
+    }
+
+    public String getDbIdentifier() {
+        return dbIdentifier;
+    }
+}
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/GlobalState.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/GlobalState.java
new file mode 100644
index 0000000000..c6f1d394a2
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/GlobalState.java
@@ -0,0 +1,52 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.coordination.partition;
+
+import org.opensearch.dataprepper.model.source.coordinator.SourcePartitionStoreItem;
+import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourcePartition;
+
+import java.util.Map;
+import java.util.Optional;
+
+public class GlobalState extends EnhancedSourcePartition<Map<String, Object>> {
+
+    private final String stateName;
+
+    private Map<String, Object> state;
+
+    public GlobalState(String stateName, Map<String, Object> state) {
+        this.stateName = stateName;
+        this.state = state;
+    }
+
+    public GlobalState(SourcePartitionStoreItem sourcePartitionStoreItem) {
+        setSourcePartitionStoreItem(sourcePartitionStoreItem);
+        stateName = sourcePartitionStoreItem.getSourcePartitionKey();
+        state = convertStringToPartitionProgressState(null, sourcePartitionStoreItem.getPartitionProgressState());
+    }
+
+    @Override
+    public String getPartitionType() {
+        return null;
+    }
+
+    @Override
+    public String getPartitionKey() {
+        return stateName;
+    }
+
+    @Override
+    public Optional<Map<String, Object>> getProgressState() {
+        if (state != null) {
+            return Optional.of(state);
+        }
+        return Optional.empty();
+    }
+
+    public void setProgressState(Map<String, Object> state) {
+        this.state = state;
+    }
+}
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/LeaderPartition.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/LeaderPartition.java
new file mode 100644
index 0000000000..806b199998
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/partition/LeaderPartition.java
@@ -0,0 +1,55 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.coordination.partition;
+
+import org.opensearch.dataprepper.model.source.coordinator.SourcePartitionStoreItem;
+import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourcePartition;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.state.LeaderProgressState;
+
+import java.util.Optional;
+
+/**
+ * A LeaderPartition is for some tasks that should be done on a single node only.
+ * Hence whatever node owns the lease of this partition will act as the 'leader'.
+ * In this RDS source design, a leader node will be responsible for:
+ * - Initialization process (create EXPORT and STREAM partitions)
+ * - Triggering RDS export task
+ * - Reading stream data
+ */
+public class LeaderPartition extends EnhancedSourcePartition<LeaderProgressState> {
+    public static final String PARTITION_TYPE = "LEADER";
+
+    // identifier for the partition
+    private static final String DEFAULT_PARTITION_KEY = "GLOBAL";
+
+    private final LeaderProgressState state;
+
+    public LeaderPartition() {
+        this.state = new LeaderProgressState();
+    }
+
+    public LeaderPartition(SourcePartitionStoreItem partitionStoreItem) {
+        setSourcePartitionStoreItem(partitionStoreItem);
+        this.state = convertStringToPartitionProgressState(LeaderProgressState.class, partitionStoreItem.getPartitionProgressState());
+    }
+
+    @Override
+    public String getPartitionType() {
+        return PARTITION_TYPE;
+    }
+
+    @Override
+    public String getPartitionKey() {
+        return DEFAULT_PARTITION_KEY;
+    }
+
+    @Override
+    public Optional<LeaderProgressState> getProgressState() {
+        return Optional.of(state);
+    }
+}
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/state/DataFileProgressState.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/state/DataFileProgressState.java
new file mode 100644
index 0000000000..c65c0bbe01
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/state/DataFileProgressState.java
@@ -0,0 +1,44 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.coordination.state;
+
+import com.fasterxml.jackson.annotation.JsonProperty;
+
+public class DataFileProgressState {
+
+    @JsonProperty("isLoaded")
+    private boolean isLoaded = false;
+
+    @JsonProperty("totalRecords")
+    private int totalRecords;
+
+    @JsonProperty("sourceTable")
+    private String sourceTable;
+
+    public int getTotalRecords() {
+        return totalRecords;
+    }
+
+    public void setTotalRecords(int totalRecords) {
+        this.totalRecords = totalRecords;
+    }
+
+    public boolean getLoaded() {
+        return isLoaded;
+    }
+
+    public void setLoaded(boolean loaded) {
+        this.isLoaded = loaded;
+    }
+
+    public String getSourceTable() {
+        return sourceTable;
+    }
+
+    public void setSourceTable(String sourceTable) {
+        this.sourceTable = sourceTable;
+    }
+}
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/state/ExportProgressState.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/state/ExportProgressState.java
new file mode 100644
index 0000000000..cde2be6dd8
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/state/ExportProgressState.java
@@ -0,0 +1,115 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.coordination.state;
+
+import com.fasterxml.jackson.annotation.JsonProperty;
+
+import java.util.List;
+
+/**
+ * Progress state for an EXPORT partition
+ */
+public class ExportProgressState {
+
+    @JsonProperty("snapshotId")
+    private String snapshotId;
+
+    @JsonProperty("exportTaskId")
+    private String exportTaskId;
+
+    @JsonProperty("iamRoleArn")
+    private String iamRoleArn;
+
+    @JsonProperty("bucket")
+    private String bucket;
+
+    @JsonProperty("prefix")
+    private String prefix;
+
+    @JsonProperty("tables")
+    private List<String> tables;
+
+    @JsonProperty("kmsKeyId")
+    private String kmsKeyId;
+
+    @JsonProperty("exportTime")
+    private String exportTime;
+
+    @JsonProperty("status")
+    private String status;
+
+    public String getSnapshotId() {
+        return snapshotId;
+    }
+
+    public void setSnapshotId(String snapshotId) {
+        this.snapshotId = snapshotId;
+    }
+
+    public String getExportTaskId() {
+        return exportTaskId;
+    }
+
+    public void setExportTaskId(String exportTaskId) {
+        this.exportTaskId = exportTaskId;
+    }
+
+    public String getIamRoleArn() {
+        return iamRoleArn;
+    }
+
+    public void setIamRoleArn(String iamRoleArn) {
+        this.iamRoleArn = iamRoleArn;
+    }
+
+    public String getBucket() {
+        return bucket;
+    }
+
+    public void setBucket(String bucket) {
+        this.bucket = bucket;
+    }
+
+    public String getPrefix() {
+        return prefix;
+    }
+
+    public void setPrefix(String prefix) {
+        this.prefix = prefix;
+    }
+
+    public List<String> getTables() {
+        return tables;
+    }
+
+    public void setTables(List<String> tables) {
+        this.tables = tables;
+    }
+
+    public String getKmsKeyId() {
+        return kmsKeyId;
+    }
+
+    public void setKmsKeyId(String kmsKeyId) {
+        this.kmsKeyId = kmsKeyId;
+    }
+
+    public String getExportTime() {
+        return exportTime;
+    }
+
+    public void setExportTime(String exportTime) {
+        this.exportTime = exportTime;
+    }
+
+    public String getStatus() {
+        return status;
+    }
+
+    public void setStatus(String status) {
+        this.status = status;
+    }
+}
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/state/LeaderProgressState.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/state/LeaderProgressState.java
new file mode 100644
index 0000000000..216fb64fae
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/coordination/state/LeaderProgressState.java
@@ -0,0 +1,25 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.coordination.state;
+
+import com.fasterxml.jackson.annotation.JsonProperty;
+
+/**
+ * Progress state for a LEADER partition
+ */
+public class LeaderProgressState {
+
+    @JsonProperty("initialized")
+    private boolean initialized = false;
+
+    public boolean isInitialized() {
+        return initialized;
+    }
+
+    public void setInitialized(boolean initialized) {
+        this.initialized = initialized;
+    }
+}
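Given the `@JsonProperty` annotations above, the progress state stored in the coordination store would serialize roughly as the following JSON (illustrative values only; all names come from the annotations):

    {
      "snapshotId": "my-db-snapshot-a1b2c3d4",
      "exportTaskId": null,
      "iamRoleArn": "arn:aws:iam::123456789012:role/rds-export-role",
      "bucket": "my-bucket",
      "prefix": "rds-export",
      "tables": ["mydb.customers"],
      "kmsKeyId": "my-kms-key",
      "exportTime": "2024-05-01T00:00:00Z",
      "status": null
    }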
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/DataFileLoader.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/DataFileLoader.java
new file mode 100644
index 0000000000..e76a04e99d
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/DataFileLoader.java
@@ -0,0 +1,83 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.export;
+
+import org.opensearch.dataprepper.buffer.common.BufferAccumulator;
+import org.opensearch.dataprepper.model.codec.InputCodec;
+import org.opensearch.dataprepper.model.event.Event;
+import org.opensearch.dataprepper.model.record.Record;
+import org.opensearch.dataprepper.plugins.source.rds.converter.ExportRecordConverter;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.DataFilePartition;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.InputStream;
+
+public class DataFileLoader implements Runnable {
+
+    private static final Logger LOG = LoggerFactory.getLogger(DataFileLoader.class);
+
+    private final DataFilePartition dataFilePartition;
+    private final String bucket;
+    private final String objectKey;
+    private final S3ObjectReader objectReader;
+    private final InputCodec codec;
+    private final BufferAccumulator<Record<Event>> bufferAccumulator;
+    private final ExportRecordConverter recordConverter;
+
+    private DataFileLoader(final DataFilePartition dataFilePartition,
+                           final InputCodec codec,
+                           final BufferAccumulator<Record<Event>> bufferAccumulator,
+                           final S3ObjectReader objectReader,
+                           final ExportRecordConverter recordConverter) {
+        this.dataFilePartition = dataFilePartition;
+        bucket = dataFilePartition.getBucket();
+        objectKey = dataFilePartition.getKey();
+        this.objectReader = objectReader;
+        this.codec = codec;
+        this.bufferAccumulator = bufferAccumulator;
+        this.recordConverter = recordConverter;
+    }
+
+    public static DataFileLoader create(final DataFilePartition dataFilePartition,
+                                        final InputCodec codec,
+                                        final BufferAccumulator<Record<Event>> bufferAccumulator,
+                                        final S3ObjectReader objectReader,
+                                        final ExportRecordConverter recordConverter) {
+        return new DataFileLoader(dataFilePartition, codec, bufferAccumulator, objectReader, recordConverter);
+    }
+
+    @Override
+    public void run() {
+        LOG.info("Start loading s3://{}/{}", bucket, objectKey);
+
+        try (InputStream inputStream = objectReader.readFile(bucket, objectKey)) {
+
+            codec.parse(inputStream, record -> {
+                try {
+                    final String tableName = dataFilePartition.getProgressState().get().getSourceTable();
+                    // TODO: primary key to be obtained by querying database schema
+                    final String primaryKeyName = "id";
+                    Record<Event> transformedRecord = new Record<>(recordConverter.convert(record, tableName, primaryKeyName));
+                    bufferAccumulator.add(transformedRecord);
+                } catch (Exception e) {
+                    throw new RuntimeException(e);
+                }
+            });
+
+            LOG.info("Completed loading object s3://{}/{} to buffer", bucket, objectKey);
+        } catch (Exception e) {
+            LOG.error("Failed to load object s3://{}/{} to buffer", bucket, objectKey, e);
+            throw new RuntimeException(e);
+        }
+
+        try {
+            bufferAccumulator.flush();
+        } catch (Exception e) {
+            LOG.error("Failed to write events to buffer", e);
+        }
+    }
+}
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/DataFileScheduler.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/DataFileScheduler.java
new file mode 100644
index 0000000000..d465d55076
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/DataFileScheduler.java
@@ -0,0 +1,163 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.export;
+
+import org.opensearch.dataprepper.buffer.common.BufferAccumulator;
+import org.opensearch.dataprepper.model.buffer.Buffer;
+import org.opensearch.dataprepper.model.codec.InputCodec;
+import org.opensearch.dataprepper.model.event.Event;
+import org.opensearch.dataprepper.model.event.EventFactory;
+import org.opensearch.dataprepper.model.record.Record;
+import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourceCoordinator;
+import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourcePartition;
+import org.opensearch.dataprepper.plugins.codec.parquet.ParquetInputCodec;
+import org.opensearch.dataprepper.plugins.source.rds.RdsSourceConfig;
+import org.opensearch.dataprepper.plugins.source.rds.converter.ExportRecordConverter;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.DataFilePartition;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.GlobalState;
+import org.opensearch.dataprepper.plugins.source.rds.model.LoadStatus;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import software.amazon.awssdk.services.s3.S3Client;
+
+import java.time.Duration;
+import java.time.Instant;
+import java.util.Optional;
+import java.util.concurrent.CompletableFuture;
+import java.util.concurrent.ExecutorService;
+import java.util.concurrent.Executors;
+import java.util.concurrent.atomic.AtomicInteger;
+
+import static org.opensearch.dataprepper.plugins.source.rds.RdsService.DATA_LOADER_MAX_JOB_COUNT;
+
+public class DataFileScheduler implements Runnable {
+
+    private static final Logger LOG = LoggerFactory.getLogger(DataFileScheduler.class);
+
+    private final AtomicInteger numOfWorkers = new AtomicInteger(0);
+
+    /**
+     * Default interval to acquire a lease from coordination store
+     */
+    private static final int DEFAULT_LEASE_INTERVAL_MILLIS = 2_000;
+
+    private static final Duration DEFAULT_UPDATE_LOAD_STATUS_TIMEOUT = Duration.ofMinutes(30);
+
+    static final Duration BUFFER_TIMEOUT = Duration.ofSeconds(60);
+    static final int DEFAULT_BUFFER_BATCH_SIZE = 1_000;
+
+
+    private final EnhancedSourceCoordinator sourceCoordinator;
+    private final ExecutorService executor;
+    private final RdsSourceConfig sourceConfig;
+    private final S3ObjectReader objectReader;
+    private final InputCodec codec;
+    private final BufferAccumulator<Record<Event>> bufferAccumulator;
+    private final ExportRecordConverter recordConverter;
+
+    private volatile boolean shutdownRequested = false;
+
+    public DataFileScheduler(final EnhancedSourceCoordinator sourceCoordinator,
+                             final RdsSourceConfig sourceConfig,
+                             final S3Client s3Client,
+                             final EventFactory eventFactory,
+                             final Buffer<Record<Event>> buffer) {
+        this.sourceCoordinator = sourceCoordinator;
+        this.sourceConfig = sourceConfig;
+        codec = new ParquetInputCodec(eventFactory);
+        bufferAccumulator = BufferAccumulator.create(buffer, DEFAULT_BUFFER_BATCH_SIZE, BUFFER_TIMEOUT);
+        objectReader = new S3ObjectReader(s3Client);
+        recordConverter = new ExportRecordConverter();
+        executor = Executors.newFixedThreadPool(DATA_LOADER_MAX_JOB_COUNT);
+    }
+
+    @Override
+    public void run() {
+        LOG.debug("Starting Data File Scheduler to process S3 data files for export");
+
+        while (!shutdownRequested && !Thread.currentThread().isInterrupted()) {
+            try {
+                if (numOfWorkers.get() < DATA_LOADER_MAX_JOB_COUNT) {
+                    final Optional<EnhancedSourcePartition> sourcePartition = sourceCoordinator.acquireAvailablePartition(DataFilePartition.PARTITION_TYPE);
+
+                    if (sourcePartition.isPresent()) {
+                        LOG.debug("Acquired data file partition");
+                        DataFilePartition dataFilePartition = (DataFilePartition) sourcePartition.get();
+                        LOG.debug("Start processing data file partition");
+                        processDataFilePartition(dataFilePartition);
+                    }
+                }
+                try {
+                    Thread.sleep(DEFAULT_LEASE_INTERVAL_MILLIS);
+                } catch (final InterruptedException e) {
+                    LOG.info("The DataFileScheduler was interrupted while waiting to retry, stopping processing");
+                    break;
+                }
+            } catch (final Exception e) {
+                LOG.error("Received an exception while processing an S3 data file, backing off and retrying", e);
+                try {
+                    Thread.sleep(DEFAULT_LEASE_INTERVAL_MILLIS);
+                } catch (final InterruptedException ex) {
+                    LOG.info("The DataFileScheduler was interrupted while waiting to retry, stopping processing");
+                    break;
+                }
+            }
+        }
+        LOG.warn("Data file scheduler is interrupted, stopping all data file loaders...");
+
+        executor.shutdown();
+    }
+
+    public void shutdown() {
+        shutdownRequested = true;
+    }
+
+    private void processDataFilePartition(DataFilePartition dataFilePartition) {
+        Runnable loader = DataFileLoader.create(dataFilePartition, codec, bufferAccumulator, objectReader, recordConverter);
+        CompletableFuture<Void> runLoader = CompletableFuture.runAsync(loader, executor);
+
+        runLoader.whenComplete((v, ex) -> {
+            if (ex == null) {
+                // Update global state so we know if all s3 files have been loaded
+                updateLoadStatus(dataFilePartition.getExportTaskId(), DEFAULT_UPDATE_LOAD_STATUS_TIMEOUT);
+                sourceCoordinator.completePartition(dataFilePartition);
+            } else {
+                LOG.error("There was an exception while processing an S3 data file", ex);
+                sourceCoordinator.giveUpPartition(dataFilePartition);
+            }
+            numOfWorkers.decrementAndGet();
+        });
+        numOfWorkers.incrementAndGet();
+    }
+
+    private void updateLoadStatus(String exportTaskId, Duration timeout) {
+
+        Instant endTime = Instant.now().plus(timeout);
+        // Keep retrying in case the update fails due to conflicts, until timed out
+        while (Instant.now().isBefore(endTime)) {
+            Optional<EnhancedSourcePartition> globalStatePartition = sourceCoordinator.getPartition(exportTaskId);
+            if (globalStatePartition.isEmpty()) {
+                LOG.error("Failed to get data file load status for {}", exportTaskId);
+                return;
+            }
+
+            GlobalState globalState = (GlobalState) globalStatePartition.get();
+            LoadStatus loadStatus = LoadStatus.fromMap(globalState.getProgressState().get());
+            loadStatus.setLoadedFiles(loadStatus.getLoadedFiles() + 1);
+            LOG.info("Current data file load status: total {} loaded {}", loadStatus.getTotalFiles(), loadStatus.getLoadedFiles());
+
+            globalState.setProgressState(loadStatus.toMap());
+
+            try {
+                sourceCoordinator.saveProgressStateForPartition(globalState, null);
+                // TODO: If stream is enabled and loadStatus.getLoadedFiles() == loadStatus.getTotalFiles(), create a global state to indicate that the stream can start
+                break;
+            } catch (Exception e) {
+                LOG.error("Failed to update the global status, looks like the status was out of date, will retry...");
+            }
+        }
+    }
+}
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/ExportScheduler.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/ExportScheduler.java
index 9c83643c68..abcbd2c1f4 100644
--- a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/ExportScheduler.java
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/ExportScheduler.java
@@ -7,13 +7,37 @@
 import org.opensearch.dataprepper.metrics.PluginMetrics;
 import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourceCoordinator;
+import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourcePartition;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.DataFilePartition;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.ExportPartition;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.GlobalState;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.state.DataFileProgressState;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.state.ExportProgressState;
+import org.opensearch.dataprepper.plugins.source.rds.model.ExportObjectKey;
+import org.opensearch.dataprepper.plugins.source.rds.model.ExportStatus;
+import org.opensearch.dataprepper.plugins.source.rds.model.LoadStatus;
+import org.opensearch.dataprepper.plugins.source.rds.model.SnapshotInfo;
+import org.opensearch.dataprepper.plugins.source.rds.model.SnapshotStatus;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import software.amazon.awssdk.services.rds.RdsClient;
+import software.amazon.awssdk.services.s3.S3Client;
+import software.amazon.awssdk.services.s3.model.ListObjectsV2Request;
+import software.amazon.awssdk.services.s3.model.ListObjectsV2Response;
+import software.amazon.awssdk.services.s3.model.S3Object;
 
 import java.time.Duration;
+import java.time.Instant;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Optional;
+import java.util.concurrent.Callable;
+import java.util.concurrent.CompletableFuture;
 import java.util.concurrent.ExecutorService;
 import java.util.concurrent.Executors;
+import java.util.concurrent.atomic.AtomicInteger;
+import java.util.function.BiConsumer;
+import java.util.stream.Collectors;
 
 public class ExportScheduler implements Runnable {
     private static final Logger LOG = LoggerFactory.getLogger(ExportScheduler.class);
@@ -23,26 +47,272 @@ public class ExportScheduler implements Runnable {
     private static final int DEFAULT_MAX_CLOSE_COUNT = 36;
     private static final int DEFAULT_CHECKPOINT_INTERVAL_MILLS = 5 * 60_000;
     private static final int DEFAULT_CHECK_STATUS_INTERVAL_MILLS = 30 * 1000;
+    private static final Duration DEFAULT_SNAPSHOT_STATUS_CHECK_TIMEOUT = Duration.ofMinutes(60);
+    static final String PARQUET_SUFFIX = ".parquet";
 
     private final RdsClient rdsClient;
-
+    private final S3Client s3Client;
     private final PluginMetrics pluginMetrics;
-
     private final EnhancedSourceCoordinator sourceCoordinator;
-
     private final ExecutorService executor;
+    private final ExportTaskManager exportTaskManager;
+    private final SnapshotManager snapshotManager;
+
+    private volatile boolean shutdownRequested = false;
 
     public ExportScheduler(final EnhancedSourceCoordinator sourceCoordinator,
                            final RdsClient rdsClient,
+                           final S3Client s3Client,
                            final PluginMetrics pluginMetrics) {
         this.pluginMetrics = pluginMetrics;
         this.sourceCoordinator = sourceCoordinator;
         this.rdsClient = rdsClient;
+        this.s3Client = s3Client;
         this.executor = Executors.newCachedThreadPool();
+        this.exportTaskManager = new ExportTaskManager(rdsClient);
+        this.snapshotManager = new SnapshotManager(rdsClient);
     }
 
     @Override
     public void run() {
+        LOG.debug("Start running Export Scheduler");
+        while (!shutdownRequested && !Thread.currentThread().isInterrupted()) {
+            try {
+                final Optional<EnhancedSourcePartition> sourcePartition = sourceCoordinator.acquireAvailablePartition(ExportPartition.PARTITION_TYPE);
+
+                if (sourcePartition.isPresent()) {
+                    ExportPartition exportPartition = (ExportPartition) sourcePartition.get();
+                    LOG.debug("Acquired an export partition: {}", exportPartition.getPartitionKey());
+
+                    String exportTaskId = getOrCreateExportTaskId(exportPartition);
+
+                    if (exportTaskId == null) {
+                        LOG.error("The export to S3 failed; it will be retried");
+                        closeExportPartitionWithError(exportPartition);
+                    } else {
+                        CheckExportStatusRunner checkExportStatusRunner = new CheckExportStatusRunner(sourceCoordinator, exportTaskManager, exportPartition);
+                        CompletableFuture<String> checkStatus = CompletableFuture.supplyAsync(checkExportStatusRunner::call, executor);
+                        checkStatus.whenComplete(completeExport(exportPartition));
+                    }
+                }
+
+                try {
+                    Thread.sleep(DEFAULT_TAKE_LEASE_INTERVAL_MILLIS);
+                } catch (final InterruptedException e) {
+                    LOG.info("The ExportScheduler was interrupted while waiting to retry, stopping processing");
+                    break;
+                }
+            } catch (final Exception e) {
+                LOG.error("Received an exception during export, backing off and retrying", e);
+                try {
+                    Thread.sleep(DEFAULT_TAKE_LEASE_INTERVAL_MILLIS);
+                } catch (final InterruptedException ex) {
+                    LOG.info("The ExportScheduler was interrupted while waiting to retry, stopping processing");
+                    break;
+                }
+            }
+        }
+        LOG.warn("Export scheduler interrupted, looks like shutdown has triggered");
+        executor.shutdownNow();
+    }
+
+    public void shutdown() {
+        shutdownRequested = true;
+    }
+
+    private String getOrCreateExportTaskId(ExportPartition exportPartition) {
+        ExportProgressState progressState = exportPartition.getProgressState().get();
+
+        if (progressState.getExportTaskId() != null) {
+            LOG.info("Export task has already been created for db {}", exportPartition.getDbIdentifier());
+            return progressState.getExportTaskId();
+        }
+
+        LOG.info("Creating a new snapshot for db {}", exportPartition.getDbIdentifier());
+        SnapshotInfo snapshotInfo = snapshotManager.createSnapshot(exportPartition.getDbIdentifier());
+        if (snapshotInfo != null) {
+            LOG.info("Snapshot id is {}", snapshotInfo.getSnapshotId());
+            progressState.setSnapshotId(snapshotInfo.getSnapshotId());
+            sourceCoordinator.saveProgressStateForPartition(exportPartition, null);
+        } else {
+            LOG.error("Failed to create snapshot; it will be retried");
+            closeExportPartitionWithError(exportPartition);
+            return null;
+        }
+
+        final String snapshotId = snapshotInfo.getSnapshotId();
+        try {
+            checkSnapshotStatus(snapshotId, DEFAULT_SNAPSHOT_STATUS_CHECK_TIMEOUT);
+        } catch (Exception e) {
+            LOG.warn("Check snapshot status for {} failed", snapshotId, e);
+            sourceCoordinator.giveUpPartition(exportPartition);
+            return null;
+        }
+
+        LOG.info("Creating an export task for db {} from snapshot {}", exportPartition.getDbIdentifier(), snapshotId);
+        String exportTaskId = exportTaskManager.startExportTask(
+                snapshotInfo.getSnapshotArn(), progressState.getIamRoleArn(), progressState.getBucket(),
+                progressState.getPrefix(), progressState.getKmsKeyId(), progressState.getTables());
+
+        if (exportTaskId != null) {
+            LOG.info("Export task id is {}", exportTaskId);
+            progressState.setExportTaskId(exportTaskId);
+            sourceCoordinator.saveProgressStateForPartition(exportPartition, null);
+        } else {
+            LOG.error("Failed to create export task; it will be retried");
+            closeExportPartitionWithError(exportPartition);
+            return null;
+        }
+
+        return exportTaskId;
+    }
+
+    private void closeExportPartitionWithError(ExportPartition exportPartition) {
+        ExportProgressState exportProgressState = exportPartition.getProgressState().get();
+        // Clear the current task id so that a new export can be submitted.
+        exportProgressState.setExportTaskId(null);
+        sourceCoordinator.closePartition(exportPartition, DEFAULT_CLOSE_DURATION, DEFAULT_MAX_CLOSE_COUNT);
+    }
+
+    private String checkSnapshotStatus(String snapshotId, Duration timeout) {
+        final Instant endTime = Instant.now().plus(timeout);
+
+        LOG.debug("Start checking status of snapshot {}", snapshotId);
+        while (Instant.now().isBefore(endTime)) {
+            SnapshotInfo snapshotInfo = snapshotManager.checkSnapshotStatus(snapshotId);
+            String status = snapshotInfo.getStatus();
+            // Valid snapshot statuses are: available, copying, creating
+            // The status should never be "copying" here
+            if (SnapshotStatus.AVAILABLE.getStatusName().equals(status)) {
+                LOG.info("Snapshot {} is available.", snapshotId);
+                return status;
+            }
+
+            LOG.debug("Snapshot {} is still creating. Wait and check later", snapshotId);
+            try {
+                Thread.sleep(DEFAULT_CHECK_STATUS_INTERVAL_MILLS);
+            } catch (InterruptedException e) {
+                throw new RuntimeException(e);
+            }
+        }
+        throw new RuntimeException("Snapshot status check timed out.");
+    }
+
+    static class CheckExportStatusRunner implements Callable<String> {
+        private final EnhancedSourceCoordinator sourceCoordinator;
+        private final ExportTaskManager exportTaskManager;
+        private final ExportPartition exportPartition;
+
+        CheckExportStatusRunner(EnhancedSourceCoordinator sourceCoordinator, ExportTaskManager exportTaskManager, ExportPartition exportPartition) {
+            this.sourceCoordinator = sourceCoordinator;
+            this.exportTaskManager = exportTaskManager;
+            this.exportPartition = exportPartition;
+        }
+
+        @Override
+        public String call() {
+            return checkExportStatus(exportPartition);
+        }
+
+        private String checkExportStatus(ExportPartition exportPartition) {
+            long lastCheckpointTime = System.currentTimeMillis();
+            String exportTaskId = exportPartition.getProgressState().get().getExportTaskId();
+
+            LOG.debug("Start checking the status of export {}", exportTaskId);
+            while (true) {
+                if (System.currentTimeMillis() - lastCheckpointTime > DEFAULT_CHECKPOINT_INTERVAL_MILLS) {
+                    sourceCoordinator.saveProgressStateForPartition(exportPartition, null);
+                    lastCheckpointTime = System.currentTimeMillis();
+                }
+
+                // Valid statuses are: CANCELED, CANCELING, COMPLETE, FAILED, IN_PROGRESS, STARTING
+                String status = exportTaskManager.checkExportStatus(exportTaskId);
+                LOG.debug("Current export status is {}.", status);
+                if (ExportStatus.isTerminal(status)) {
+                    LOG.info("Export {} is completed with final status {}", exportTaskId, status);
+                    return status;
+                }
+                LOG.debug("Export {} is still in progress. Wait and check later", exportTaskId);
+                try {
+                    Thread.sleep(DEFAULT_CHECK_STATUS_INTERVAL_MILLS);
+                } catch (InterruptedException e) {
+                    throw new RuntimeException(e);
+                }
+            }
+        }
+    }
+
+    private BiConsumer<String, Throwable> completeExport(ExportPartition exportPartition) {
+        return (status, ex) -> {
+            if (ex != null) {
+                LOG.warn("Check export status for {} failed", exportPartition.getPartitionKey(), ex);
+                sourceCoordinator.giveUpPartition(exportPartition);
+            } else {
+                if (!ExportStatus.COMPLETE.name().equals(status)) {
+                    LOG.error("Export failed with status {}", status);
+                    closeExportPartitionWithError(exportPartition);
+                    return;
+                }
+                LOG.info("Export for {} completed successfully", exportPartition.getPartitionKey());
+
+                ExportProgressState state = exportPartition.getProgressState().get();
+                String bucket = state.getBucket();
+                String prefix = state.getPrefix();
+                String exportTaskId = state.getExportTaskId();
+
+                // Create data file partitions for processing S3 files
+                List<String> dataFileObjectKeys = getDataFileObjectKeys(bucket, prefix, exportTaskId);
+                createDataFilePartitions(bucket, exportTaskId, dataFileObjectKeys);
+
+                completeExportPartition(exportPartition);
+            }
+        };
+    }
+
+    private List<String> getDataFileObjectKeys(String bucket, String prefix, String exportTaskId) {
+        LOG.debug("Fetching object keys for export data files.");
+        ListObjectsV2Request.Builder requestBuilder = ListObjectsV2Request.builder()
+                .bucket(bucket)
+                .prefix(prefix + "/" + exportTaskId);
+
+        List<String> objectKeys = new ArrayList<>();
+        ListObjectsV2Response response = null;
+        do {
+            String nextToken = response == null ? null : response.nextContinuationToken();
+            response = s3Client.listObjectsV2(requestBuilder
+                    .continuationToken(nextToken)
+                    .build());
+            objectKeys.addAll(response.contents().stream()
+                    .map(S3Object::key)
+                    .filter(key -> key.endsWith(PARQUET_SUFFIX))
+                    .collect(Collectors.toList()));
+
+        } while (response.isTruncated());
+        return objectKeys;
+    }
+
+    private void createDataFilePartitions(String bucket, String exportTaskId, List<String> dataFileObjectKeys) {
+        LOG.info("Total of {} data files generated for export {}", dataFileObjectKeys.size(), exportTaskId);
+        AtomicInteger totalFiles = new AtomicInteger();
+        for (final String objectKey : dataFileObjectKeys) {
+            DataFileProgressState progressState = new DataFileProgressState();
+            ExportObjectKey exportObjectKey = ExportObjectKey.fromString(objectKey);
+            String table = exportObjectKey.getTableName();
+            progressState.setSourceTable(table);
+
+            DataFilePartition dataFilePartition = new DataFilePartition(exportTaskId, bucket, objectKey, Optional.of(progressState));
+            sourceCoordinator.createPartition(dataFilePartition);
+            totalFiles.getAndIncrement();
+        }
+
+        // Create a global state to track overall progress for data file processing
+        LoadStatus loadStatus = new LoadStatus(totalFiles.get(), 0);
+        sourceCoordinator.createPartition(new GlobalState(exportTaskId, loadStatus.toMap()));
+    }
+
+    private void completeExportPartition(ExportPartition exportPartition) {
+        ExportProgressState progressState = exportPartition.getProgressState().get();
+        progressState.setStatus("Completed");
+        sourceCoordinator.completePartition(exportPartition);
     }
 }
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/ExportTaskManager.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/ExportTaskManager.java
new file mode 100644
index 0000000000..dc447c2f42
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/ExportTaskManager.java
@@ -0,0 +1,79 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.export;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import software.amazon.awssdk.arns.Arn;
+import software.amazon.awssdk.services.rds.RdsClient;
+import software.amazon.awssdk.services.rds.model.DescribeExportTasksRequest;
+import software.amazon.awssdk.services.rds.model.DescribeExportTasksResponse;
+import software.amazon.awssdk.services.rds.model.StartExportTaskRequest;
+import software.amazon.awssdk.services.rds.model.StartExportTaskResponse;
+
+import java.util.Collection;
+import java.util.UUID;
+
+public class ExportTaskManager {
+
+    private static final Logger LOG = LoggerFactory.getLogger(ExportTaskManager.class);
+
+    // Export identifier cannot be longer than 60 characters
+    private static final int EXPORT_TASK_ID_MAX_LENGTH = 60;
+
+    private final RdsClient rdsClient;
+
+    public ExportTaskManager(final RdsClient rdsClient) {
+        this.rdsClient = rdsClient;
+    }
+
+    public String startExportTask(String snapshotArn, String iamRoleArn, String bucket, String prefix, String kmsKeyId, Collection<String> includeTables) {
+        final String exportTaskId = generateExportTaskId(snapshotArn);
+        StartExportTaskRequest.Builder requestBuilder = StartExportTaskRequest.builder()
+                .exportTaskIdentifier(exportTaskId)
+                .sourceArn(snapshotArn)
+                .iamRoleArn(iamRoleArn)
+                .s3BucketName(bucket)
+                .s3Prefix(prefix)
+                .kmsKeyId(kmsKeyId);
+
+        if (includeTables != null && !includeTables.isEmpty()) {
+            requestBuilder.exportOnly(includeTables);
+        }
+
+        try {
+            StartExportTaskResponse response = rdsClient.startExportTask(requestBuilder.build());
+            LOG.info("Export task submitted with id {} and status {}", exportTaskId, response.status());
+            return exportTaskId;
+
+        } catch (Exception e) {
+            LOG.error("Failed to start an export task", e);
+            return null;
+        }
+    }
+
+    public String checkExportStatus(String exportTaskId) {
+        DescribeExportTasksRequest request = DescribeExportTasksRequest.builder()
+                .exportTaskIdentifier(exportTaskId)
+                .build();
+
+        DescribeExportTasksResponse response = rdsClient.describeExportTasks(request);
+
+        return response.exportTasks().get(0).status();
+    }
+
+    private String generateExportTaskId(String snapshotArn) {
+        String snapshotId = Arn.fromString(snapshotArn).resource().resource();
+        return truncateString(snapshotId, EXPORT_TASK_ID_MAX_LENGTH - 16) + "-export-" + UUID.randomUUID().toString().substring(0, 8);
+    }
+
+    private String truncateString(String originalString, int maxLength) {
+        if (originalString.length() <= maxLength) {
+            return originalString;
+        }
+        return originalString.substring(0, maxLength);
+    }
+}
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/S3ObjectReader.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/S3ObjectReader.java
new file mode 100644
index 0000000000..39c0079198
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/S3ObjectReader.java
@@ -0,0 +1,36 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.export;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import software.amazon.awssdk.services.s3.S3Client;
+import software.amazon.awssdk.services.s3.model.GetObjectRequest;
+
+import java.io.InputStream;
+
+public class S3ObjectReader {
+
+    private static final Logger LOG = LoggerFactory.getLogger(S3ObjectReader.class);
+
+    private final S3Client s3Client;
+
+    public S3ObjectReader(S3Client s3Client) {
+        this.s3Client = s3Client;
+    }
+
+    public InputStream readFile(String bucketName, String s3Key) {
+        LOG.debug("Read file from s3://{}/{}", bucketName, s3Key);
+
+        GetObjectRequest objectRequest = GetObjectRequest.builder()
+                .bucket(bucketName)
+                .key(s3Key)
+                .build();
+
+        return s3Client.getObject(objectRequest);
+    }
+
+}
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/SnapshotManager.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/SnapshotManager.java
new file mode 100644
index 0000000000..7b8da8717c
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/export/SnapshotManager.java
@@ -0,0 +1,66 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.export;
+
+import org.opensearch.dataprepper.plugins.source.rds.model.SnapshotInfo;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import software.amazon.awssdk.services.rds.RdsClient;
+import software.amazon.awssdk.services.rds.model.CreateDbSnapshotRequest;
+import software.amazon.awssdk.services.rds.model.CreateDbSnapshotResponse;
+import software.amazon.awssdk.services.rds.model.DescribeDbSnapshotsRequest;
+import software.amazon.awssdk.services.rds.model.DescribeDbSnapshotsResponse;
+
+import java.time.Instant;
+import java.util.UUID;
+
+public class SnapshotManager {
+    private static final Logger LOG = LoggerFactory.getLogger(SnapshotManager.class);
+
+    private final RdsClient rdsClient;
+
+    public SnapshotManager(final RdsClient rdsClient) {
+        this.rdsClient = rdsClient;
+    }
+
+    public SnapshotInfo createSnapshot(String dbInstanceId) {
+        final String snapshotId = generateSnapshotId(dbInstanceId);
+        CreateDbSnapshotRequest request = CreateDbSnapshotRequest.builder()
+                .dbInstanceIdentifier(dbInstanceId)
+                .dbSnapshotIdentifier(snapshotId)
+                .build();
+
+        try {
+            CreateDbSnapshotResponse response = rdsClient.createDBSnapshot(request);
+            String snapshotArn = response.dbSnapshot().dbSnapshotArn();
+            String status = response.dbSnapshot().status();
+            Instant createTime = response.dbSnapshot().snapshotCreateTime();
+            LOG.info("Creating snapshot with id {} and status {}", snapshotId, status);
+
+            return new SnapshotInfo(snapshotId, snapshotArn, createTime, status);
+        } catch (Exception e) {
+            LOG.error("Failed to create snapshot for {}", dbInstanceId, e);
+            return null;
+        }
+    }
+
+    public SnapshotInfo checkSnapshotStatus(String snapshotId) {
+        DescribeDbSnapshotsRequest request = DescribeDbSnapshotsRequest.builder()
+                .dbSnapshotIdentifier(snapshotId)
+                .build();
+
+        DescribeDbSnapshotsResponse response = rdsClient.describeDBSnapshots(request);
+        String snapshotArn = response.dbSnapshots().get(0).dbSnapshotArn();
+        String status = response.dbSnapshots().get(0).status();
+        Instant createTime = response.dbSnapshots().get(0).snapshotCreateTime();
+
+        return new SnapshotInfo(snapshotId, snapshotArn, createTime, status);
+    }
+
+    private String generateSnapshotId(String dbClusterId) {
+        return dbClusterId + "-snapshot-" + UUID.randomUUID().toString().substring(0, 8);
+    }
+}
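A worked example of the identifier rules above (hypothetical names): for a DB instance `my-database`, the generated snapshot id might be `my-database-snapshot-a1b2c3d4`, and the export task id derived from that snapshot becomes `my-database-snapshot-a1b2c3d4-export-0f9e8d7c`. The snapshot id is truncated to 44 characters first (60 minus the 16 characters reserved for the `-export-` suffix plus random segment), so the result always fits RDS's 60-character limit.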
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/leader/LeaderScheduler.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/leader/LeaderScheduler.java
index ca99a7c8f1..4831f1e91a 100644
--- a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/leader/LeaderScheduler.java
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/leader/LeaderScheduler.java
@@ -6,11 +6,19 @@
 package org.opensearch.dataprepper.plugins.source.rds.leader;
 
 import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourceCoordinator;
+import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourcePartition;
 import org.opensearch.dataprepper.plugins.source.rds.RdsSourceConfig;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.ExportPartition;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.GlobalState;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.LeaderPartition;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.state.ExportProgressState;
+import org.opensearch.dataprepper.plugins.source.rds.coordination.state.LeaderProgressState;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
 import java.time.Duration;
+import java.time.Instant;
+import java.util.Optional;
 
 public class LeaderScheduler implements Runnable {
 
@@ -20,6 +28,9 @@ public class LeaderScheduler implements Runnable {
     private final EnhancedSourceCoordinator sourceCoordinator;
     private final RdsSourceConfig sourceConfig;
 
+    private LeaderPartition leaderPartition;
+    private volatile boolean shutdownRequested = false;
+
     public LeaderScheduler(final EnhancedSourceCoordinator sourceCoordinator, final RdsSourceConfig sourceConfig) {
         this.sourceCoordinator = sourceCoordinator;
         this.sourceConfig = sourceConfig;
@@ -27,6 +38,84 @@ public LeaderScheduler(final EnhancedSourceCoordinator sourceCoordinator, final
 
     @Override
     public void run() {
+        LOG.info("Starting Leader Scheduler for initialization.");
+
+        while (!shutdownRequested && !Thread.currentThread().isInterrupted()) {
+            try {
+                // Try to acquire the lease if not owned
+                if (leaderPartition == null) {
+                    final Optional<EnhancedSourcePartition> sourcePartition = sourceCoordinator.acquireAvailablePartition(LeaderPartition.PARTITION_TYPE);
+                    if (sourcePartition.isPresent()) {
+                        LOG.info("Running as a LEADER node.");
+                        leaderPartition = (LeaderPartition) sourcePartition.get();
+                    }
+                }
+
+                // Once owned, run the normal LEADER node process
+                if (leaderPartition != null) {
+                    LeaderProgressState leaderProgressState = leaderPartition.getProgressState().get();
+                    if (!leaderProgressState.isInitialized()) {
+                        init();
+                    }
+                }
+            } catch (final Exception e) {
+                LOG.error("Exception occurred in primary leader scheduling loop", e);
+            } finally {
+                if (leaderPartition != null) {
+                    // Extend the lease timeout; this node remains the leader until shutdown
+                    sourceCoordinator.saveProgressStateForPartition(leaderPartition, Duration.ofMinutes(DEFAULT_EXTEND_LEASE_MINUTES));
+                }
+
+                try {
+                    Thread.sleep(DEFAULT_LEASE_INTERVAL.toMillis());
+                } catch (final InterruptedException e) {
+                    LOG.info("InterruptedException occurred while waiting in leader scheduling loop.");
+                    break;
+                }
+            }
+        }
+
+        // Should stop
+        LOG.warn("Quitting Leader Scheduler");
+        if (leaderPartition != null) {
+            sourceCoordinator.giveUpPartition(leaderPartition);
+        }
+    }
+
+    public void shutdown() {
+        shutdownRequested = true;
+    }
+
+    private void init() {
+        LOG.info("Initializing RDS source service...");
+
+        // Create a global state in the coordination table for the configuration.
+        // The global state is readable whenever needed, so that jobs can refer to the configuration.
+        sourceCoordinator.createPartition(new GlobalState(sourceConfig.getDbIdentifier(), null));
+
+        if (sourceConfig.isExportEnabled()) {
+            Instant startTime = Instant.now();
+            LOG.debug("Export is enabled. Creating export partition in the source coordination store.");
+            createExportPartition(sourceConfig, startTime);
+        }
+        LOG.debug("Update initialization state");
+        LeaderProgressState leaderProgressState = leaderPartition.getProgressState().get();
+        leaderProgressState.setInitialized(true);
     }
+
+    private void createExportPartition(RdsSourceConfig sourceConfig, Instant exportTime) {
+        ExportProgressState progressState = new ExportProgressState();
+        progressState.setIamRoleArn(sourceConfig.getAwsAuthenticationConfig().getAwsStsRoleArn());
+        progressState.setBucket(sourceConfig.getS3Bucket());
+        progressState.setPrefix(sourceConfig.getS3Prefix());
+        progressState.setTables(sourceConfig.getTableNames());
+        progressState.setKmsKeyId(sourceConfig.getExport().getKmsKeyId());
+        progressState.setExportTime(exportTime.toString());
+        ExportPartition exportPartition = new ExportPartition(sourceConfig.getDbIdentifier(), sourceConfig.isCluster(), progressState);
+        sourceCoordinator.createPartition(exportPartition);
+    }
+
 }
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/ExportObjectKey.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/ExportObjectKey.java
new file mode 100644
index 0000000000..c69dcc7651
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/ExportObjectKey.java
@@ -0,0 +1,68 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.model;
+
+/**
+ * Represents the object key for an object exported to S3 by RDS.
+ * The object key has this structure: "{prefix}/{export task ID}/{database name}/{table name}/{numbered folder}/{file name}"
+ */
+public class ExportObjectKey {
+
+    private final String prefix;
+    private final String exportTaskId;
+    private final String databaseName;
+    private final String tableName;
+    private final String numberedFolder;
+    private final String fileName;
+
+    ExportObjectKey(final String prefix, final String exportTaskId, final String databaseName, final String tableName, final String numberedFolder, final String fileName) {
+        this.prefix = prefix;
+        this.exportTaskId = exportTaskId;
+        this.databaseName = databaseName;
+        this.tableName = tableName;
+        this.numberedFolder = numberedFolder;
+        this.fileName = fileName;
+    }
+
+    public static ExportObjectKey fromString(final String objectKeyString) {
+
+        final String[] parts = objectKeyString.split("/");
+        if (parts.length != 6) {
+            throw new IllegalArgumentException("Export object key is not valid: " + objectKeyString);
+        }
+        final String prefix = parts[0];
+        final String exportTaskId = parts[1];
+        final String databaseName = parts[2];
+        final String tableName = parts[3];
+        final String numberedFolder = parts[4];
+        final String fileName = parts[5];
+        return new ExportObjectKey(prefix, exportTaskId, databaseName, tableName, numberedFolder, fileName);
+    }
+
+    public String getPrefix() {
+        return prefix;
+    }
+
+    public String getExportTaskId() {
+        return exportTaskId;
+    }
+
+    public String getDatabaseName() {
+        return databaseName;
+    }
+
+    public String getTableName() {
+        return tableName;
+    }
+
+    public String getNumberedFolder() {
+        return numberedFolder;
+    }
+
+    public String getFileName() {
+        return fileName;
+    }
+}
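For example, parsing a hypothetical key that follows the documented structure:

    ExportObjectKey key = ExportObjectKey.fromString("rds-export/task-1/mydb/customers/1/part-00000.parquet");
    key.getExportTaskId();  // "task-1"
    key.getTableName();     // "customers"
    key.getFileName();      // "part-00000.parquet"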
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/ExportStatus.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/ExportStatus.java
new file mode 100644
index 0000000000..16fb91b7f4
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/ExportStatus.java
@@ -0,0 +1,36 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.model;
+
+import java.util.Arrays;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+public enum ExportStatus {
+    CANCELED,
+    CANCELING,
+    COMPLETE,
+    FAILED,
+    IN_PROGRESS,
+    STARTING;
+
+    private static final Map<String, ExportStatus> TYPES_MAP = Arrays.stream(ExportStatus.values())
+            .collect(Collectors.toMap(
+                    Enum::name,
+                    value -> value
+            ));
+    private static final Set<ExportStatus> TERMINAL_STATUSES = Set.of(CANCELED, COMPLETE, FAILED);
+
+    public static ExportStatus fromString(final String name) {
+        return TYPES_MAP.get(name);
+    }
+
+    public static boolean isTerminal(final String name) {
+        ExportStatus status = fromString(name);
+        return status != null && TERMINAL_STATUSES.contains(status);
+    }
+}
\ No newline at end of file
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/LoadStatus.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/LoadStatus.java
new file mode 100644
index 0000000000..a2762c1b38
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/LoadStatus.java
@@ -0,0 +1,53 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.model;
+
+import java.util.Map;
+
+public class LoadStatus {
+
+    private static final String TOTAL_FILES = "totalFiles";
+    private static final String LOADED_FILES = "loadedFiles";
+
+    private int totalFiles;
+
+    private int loadedFiles;
+
+    public LoadStatus(int totalFiles, int loadedFiles) {
+        this.totalFiles = totalFiles;
+        this.loadedFiles = loadedFiles;
+    }
+
+    public int getTotalFiles() {
+        return totalFiles;
+    }
+
+    public void setTotalFiles(int totalFiles) {
+        this.totalFiles = totalFiles;
+    }
+
+    public int getLoadedFiles() {
+        return loadedFiles;
+    }
+
+    public void setLoadedFiles(int loadedFiles) {
+        this.loadedFiles = loadedFiles;
+    }
+
+    public Map<String, Object> toMap() {
+        return Map.of(
+                TOTAL_FILES, totalFiles,
+                LOADED_FILES, loadedFiles
+        );
+    }
+
+    public static LoadStatus fromMap(Map<String, Object> map) {
+        return new LoadStatus(
+                ((Number) map.get(TOTAL_FILES)).intValue(),
+                ((Number) map.get(LOADED_FILES)).intValue()
+        );
+    }
+}
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/SnapshotInfo.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/SnapshotInfo.java
new file mode 100644
index 0000000000..11bd452497
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/SnapshotInfo.java
@@ -0,0 +1,43 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.model;
+
+import java.time.Instant;
+
+public class SnapshotInfo {
+
+    private final String snapshotId;
+    private final String snapshotArn;
+    private final Instant createTime;
+    private String status;
+
+    public SnapshotInfo(String snapshotId, String snapshotArn, Instant createTime, String status) {
+        this.snapshotId = snapshotId;
+        this.snapshotArn = snapshotArn;
+        this.createTime = createTime;
+        this.status = status;
+    }
+
+    public String getSnapshotId() {
+        return snapshotId;
+    }
+
+    public String getSnapshotArn() {
+        return snapshotArn;
+    }
+
+    public Instant getCreateTime() {
+        return createTime;
+    }
+
+    public String getStatus() {
+        return status;
+    }
+
+    public void setStatus(String status) {
+        this.status = status;
+    }
+}
diff --git a/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/SnapshotStatus.java b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/SnapshotStatus.java
new file mode 100644
index 0000000000..a2d18f70f9
--- /dev/null
+++ b/data-prepper-plugins/rds-source/src/main/java/org/opensearch/dataprepper/plugins/source/rds/model/SnapshotStatus.java
@@ -0,0 +1,22 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.plugins.source.rds.model;
+
+public enum SnapshotStatus {
+    AVAILABLE("available"),
+    COPYING("copying"),
+    CREATING("creating");
+
+    private final String statusName;
+
+    SnapshotStatus(final String statusName) {
+        this.statusName = statusName;
+    }
+
+    public String getStatusName() {
+        return statusName;
+    }
+}
diff --git a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/RdsServiceTest.java b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/RdsServiceTest.java index 218c23d121..7a18dd6159 100644 --- a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/RdsServiceTest.java +++ b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/RdsServiceTest.java @@ -14,8 +14,10 @@ import org.opensearch.dataprepper.metrics.PluginMetrics; import org.opensearch.dataprepper.model.buffer.Buffer; import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventFactory; import org.opensearch.dataprepper.model.record.Record; import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourceCoordinator; +import org.opensearch.dataprepper.plugins.source.rds.export.DataFileScheduler; import org.opensearch.dataprepper.plugins.source.rds.export.ExportScheduler; import org.opensearch.dataprepper.plugins.source.rds.leader.LeaderScheduler; import software.amazon.awssdk.services.rds.RdsClient; @@ -47,6 +49,9 @@ class RdsServiceTest { @Mock private ExecutorService executor; + @Mock + private EventFactory eventFactory; + @Mock private ClientFactory clientFactory; @@ -56,12 +61,12 @@ class RdsServiceTest { @BeforeEach void setUp() { when(clientFactory.buildRdsClient()).thenReturn(rdsClient); - } @Test - void test_normal_service_start() { + void test_normal_service_start_when_export_is_enabled() { RdsService rdsService = createObjectUnderTest(); + when(sourceConfig.isExportEnabled()).thenReturn(true); try (final MockedStatic<Executors> executorsMockedStatic = mockStatic(Executors.class)) { executorsMockedStatic.when(() -> Executors.newFixedThreadPool(anyInt())).thenReturn(executor); rdsService.start(buffer); @@ -69,6 +74,7 @@ void test_normal_service_start() { verify(executor).submit(any(LeaderScheduler.class)); verify(executor).submit(any(ExportScheduler.class)); + verify(executor).submit(any(DataFileScheduler.class)); } @Test @@ -84,6 +90,6 @@ void test_service_shutdown_calls_executor_shutdownNow() { } private RdsService createObjectUnderTest() { - return new RdsService(sourceCoordinator, sourceConfig, clientFactory, pluginMetrics); + return new RdsService(sourceCoordinator, sourceConfig, eventFactory, clientFactory, pluginMetrics); } } \ No newline at end of file
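The renamed test asserts that all three schedulers are submitted when export is enabled. A sketch of the start-up wiring this implies (constructor arguments mirror the test fixtures elsewhere in this diff; the thread count and the exact branching inside RdsService.start are assumptions):

    final ExecutorService executor = Executors.newFixedThreadPool(3);
    executor.submit(new LeaderScheduler(sourceCoordinator, sourceConfig));
    if (sourceConfig.isExportEnabled()) {
        executor.submit(new ExportScheduler(sourceCoordinator, rdsClient, s3Client, pluginMetrics));
        executor.submit(new DataFileScheduler(sourceCoordinator, sourceConfig, s3Client, eventFactory, buffer));
    }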
diff --git a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/RdsSourceTest.java b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/RdsSourceTest.java index edd409e5e4..682f16ed51 100644 --- a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/RdsSourceTest.java +++ b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/RdsSourceTest.java @@ -12,6 +12,7 @@ import org.mockito.junit.jupiter.MockitoExtension; import org.opensearch.dataprepper.aws.api.AwsCredentialsSupplier; import org.opensearch.dataprepper.metrics.PluginMetrics; +import org.opensearch.dataprepper.model.event.EventFactory; import org.opensearch.dataprepper.plugins.source.rds.configuration.AwsAuthenticationConfig; import static org.junit.jupiter.api.Assertions.assertThrows; @@ -27,6 +28,9 @@ class RdsSourceTest { @Mock private RdsSourceConfig sourceConfig; + @Mock + private EventFactory eventFactory; + @Mock AwsCredentialsSupplier awsCredentialsSupplier; @@ -45,6 +49,6 @@ void test_when_buffer_is_null_then_start_throws_exception() { } private RdsSource createObjectUnderTest() { - return new RdsSource(pluginMetrics, sourceConfig, awsCredentialsSupplier); + return new RdsSource(pluginMetrics, sourceConfig, eventFactory, awsCredentialsSupplier); } } \ No newline at end of file diff --git a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/converter/ExportRecordConverterTest.java b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/converter/ExportRecordConverterTest.java new file mode 100644 index 0000000000..79c5597c3b --- /dev/null +++ b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/converter/ExportRecordConverterTest.java @@ -0,0 +1,51 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.rds.converter; + +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.dataprepper.event.TestEventFactory; +import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventBuilder; +import org.opensearch.dataprepper.model.record.Record; + +import java.util.Map; +import java.util.UUID; + +import static org.hamcrest.MatcherAssert.assertThat; +import static org.hamcrest.Matchers.equalTo; +import static org.hamcrest.Matchers.sameInstance; +import static org.opensearch.dataprepper.plugins.source.rds.converter.ExportRecordConverter.EXPORT_EVENT_TYPE; +import static org.opensearch.dataprepper.plugins.source.rds.converter.MetadataKeyAttributes.EVENT_TABLE_NAME_METADATA_ATTRIBUTE; +import static org.opensearch.dataprepper.plugins.source.rds.converter.MetadataKeyAttributes.INGESTION_EVENT_TYPE_ATTRIBUTE; +import static org.opensearch.dataprepper.plugins.source.rds.converter.MetadataKeyAttributes.PRIMARY_KEY_DOCUMENT_ID_METADATA_ATTRIBUTE; + +@ExtendWith(MockitoExtension.class) +class ExportRecordConverterTest { + + @Test + void test_convert() { + final String tableName = UUID.randomUUID().toString(); + final String primaryKeyName = UUID.randomUUID().toString(); + final String primaryKeyValue = UUID.randomUUID().toString(); + final Event testEvent = TestEventFactory.getTestEventFactory().eventBuilder(EventBuilder.class) + .withEventType("EVENT") + .withData(Map.of(primaryKeyName, primaryKeyValue)) + .build(); + + Record<Event> testRecord = new Record<>(testEvent); + + ExportRecordConverter exportRecordConverter = new ExportRecordConverter(); + Event actualEvent = exportRecordConverter.convert(testRecord, tableName, primaryKeyName); + + // Assert + assertThat(actualEvent.getMetadata().getAttribute(EVENT_TABLE_NAME_METADATA_ATTRIBUTE), equalTo(tableName)); + assertThat(actualEvent.getMetadata().getAttribute(PRIMARY_KEY_DOCUMENT_ID_METADATA_ATTRIBUTE), equalTo(primaryKeyValue)); + assertThat(actualEvent.getMetadata().getAttribute(INGESTION_EVENT_TYPE_ATTRIBUTE), equalTo(EXPORT_EVENT_TYPE)); + assertThat(actualEvent, sameInstance(testRecord.getData())); + } +} \ No newline at end of file diff --git a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/coordination/PartitionFactoryTest.java b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/coordination/PartitionFactoryTest.java new file mode 100644 index 0000000000..c092a8b48c --- /dev/null +++ b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/coordination/PartitionFactoryTest.java @@ -0,0 +1,61 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + 
+package org.opensearch.dataprepper.plugins.source.rds.coordination; + +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.dataprepper.model.source.coordinator.SourcePartitionStoreItem; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.ExportPartition; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.GlobalState; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.LeaderPartition; + +import java.util.UUID; + +import static org.hamcrest.MatcherAssert.assertThat; +import static org.hamcrest.Matchers.instanceOf; +import static org.mockito.Mockito.when; + +@ExtendWith(MockitoExtension.class) +class PartitionFactoryTest { + + @Mock + private SourcePartitionStoreItem partitionStoreItem; + + @Test + void given_leader_partition_item_then_create_leader_partition() { + PartitionFactory objectUnderTest = createObjectUnderTest(); + when(partitionStoreItem.getSourceIdentifier()).thenReturn(UUID.randomUUID() + "|" + LeaderPartition.PARTITION_TYPE); + when(partitionStoreItem.getPartitionProgressState()).thenReturn(null); + + assertThat(objectUnderTest.apply(partitionStoreItem), instanceOf(LeaderPartition.class)); + } + + @Test + void given_export_partition_item_then_create_export_partition() { + PartitionFactory objectUnderTest = createObjectUnderTest(); + when(partitionStoreItem.getSourceIdentifier()).thenReturn(UUID.randomUUID() + "|" + ExportPartition.PARTITION_TYPE); + when(partitionStoreItem.getSourcePartitionKey()).thenReturn(UUID.randomUUID() + "|" + UUID.randomUUID()); + when(partitionStoreItem.getPartitionProgressState()).thenReturn(null); + + assertThat(objectUnderTest.apply(partitionStoreItem), instanceOf(ExportPartition.class)); + } + + @Test + void given_store_item_of_undefined_type_then_create_global_state() { + PartitionFactory objectUnderTest = createObjectUnderTest(); + when(partitionStoreItem.getSourceIdentifier()).thenReturn(UUID.randomUUID() + "|" + UUID.randomUUID()); + when(partitionStoreItem.getSourcePartitionKey()).thenReturn(UUID.randomUUID().toString()); + when(partitionStoreItem.getPartitionProgressState()).thenReturn(null); + + assertThat(objectUnderTest.apply(partitionStoreItem), instanceOf(GlobalState.class)); + } + + private PartitionFactory createObjectUnderTest() { + return new PartitionFactory(); + } +} \ No newline at end of file diff --git a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/DataFileLoaderTest.java b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/DataFileLoaderTest.java new file mode 100644 index 0000000000..1ed91bc031 --- /dev/null +++ b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/DataFileLoaderTest.java @@ -0,0 +1,67 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.rds.export; + +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.dataprepper.buffer.common.BufferAccumulator; +import org.opensearch.dataprepper.model.codec.InputCodec; +import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.record.Record; +import 
org.opensearch.dataprepper.plugins.source.rds.converter.ExportRecordConverter; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.DataFilePartition; + +import java.io.InputStream; +import java.util.UUID; +import java.util.function.Consumer; + +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.ArgumentMatchers.eq; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.when; + +@ExtendWith(MockitoExtension.class) +class DataFileLoaderTest { + + @Mock + private DataFilePartition dataFilePartition; + + @Mock + private BufferAccumulator<Record<Event>> bufferAccumulator; + + @Mock + private InputCodec codec; + + @Mock + private S3ObjectReader s3ObjectReader; + + @Mock + private ExportRecordConverter recordConverter; + + @Test + void test_run() throws Exception { + final String bucket = UUID.randomUUID().toString(); + final String key = UUID.randomUUID().toString(); + when(dataFilePartition.getBucket()).thenReturn(bucket); + when(dataFilePartition.getKey()).thenReturn(key); + + InputStream inputStream = mock(InputStream.class); + when(s3ObjectReader.readFile(bucket, key)).thenReturn(inputStream); + + DataFileLoader objectUnderTest = createObjectUnderTest(); + objectUnderTest.run(); + + verify(codec).parse(eq(inputStream), any(Consumer.class)); + verify(bufferAccumulator).flush(); + } + + private DataFileLoader createObjectUnderTest() { + return DataFileLoader.create(dataFilePartition, codec, bufferAccumulator, s3ObjectReader, recordConverter); + } +} \ No newline at end of file
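The verifications above pin down the loader's flow: read the S3 object, let the codec parse it, convert each record, then flush. A sketch of the implied run() body (tableName and primaryKeyName are assumed to come from the partition's progress state; this is not the actual implementation):

    final InputStream inputStream = s3ObjectReader.readFile(bucket, key);
    codec.parse(inputStream, record -> {
        // Attach table name and document ID metadata before buffering
        final Event event = recordConverter.convert(record, tableName, primaryKeyName);
        try {
            bufferAccumulator.add(new Record<>(event));
        } catch (final Exception e) {
            throw new RuntimeException(e);
        }
    });
    bufferAccumulator.flush();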
diff --git a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/DataFileSchedulerTest.java b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/DataFileSchedulerTest.java new file mode 100644 index 0000000000..ee0d0e2852 --- /dev/null +++ b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/DataFileSchedulerTest.java @@ -0,0 +1,137 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.rds.export; + +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.Mock; +import org.mockito.MockedStatic; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.dataprepper.buffer.common.BufferAccumulator; +import org.opensearch.dataprepper.model.buffer.Buffer; +import org.opensearch.dataprepper.model.codec.InputCodec; +import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventFactory; +import org.opensearch.dataprepper.model.record.Record; +import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourceCoordinator; +import org.opensearch.dataprepper.plugins.source.rds.RdsSourceConfig; +import org.opensearch.dataprepper.plugins.source.rds.converter.ExportRecordConverter; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.DataFilePartition; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.GlobalState; +import org.opensearch.dataprepper.plugins.source.rds.model.LoadStatus; +import software.amazon.awssdk.services.s3.S3Client; + +import java.time.Duration; +import java.util.Map; +import java.util.Optional; +import java.util.Random; +import java.util.UUID; +import java.util.concurrent.ExecutorService; +import java.util.concurrent.Executors; + +import static org.awaitility.Awaitility.await; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.ArgumentMatchers.eq; +import static org.mockito.Mockito.doNothing; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.mockStatic; +import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.verifyNoInteractions; +import static org.mockito.Mockito.verifyNoMoreInteractions; +import static org.mockito.Mockito.when; + +@ExtendWith(MockitoExtension.class) +class DataFileSchedulerTest { + + @Mock + private EnhancedSourceCoordinator sourceCoordinator; + + @Mock + private RdsSourceConfig sourceConfig; + + @Mock + private S3Client s3Client; + + @Mock + private EventFactory eventFactory; + + @Mock + private Buffer<Record<Event>> buffer; + + @Mock + private DataFilePartition dataFilePartition; + + private Random random; + + @BeforeEach + void setUp() { + random = new Random(); + } + + @Test + void test_given_no_datafile_partition_then_no_export() throws InterruptedException { + when(sourceCoordinator.acquireAvailablePartition(DataFilePartition.PARTITION_TYPE)).thenReturn(Optional.empty()); + + final DataFileScheduler objectUnderTest = createObjectUnderTest(); + final ExecutorService executorService = Executors.newSingleThreadExecutor(); + executorService.submit(objectUnderTest); + await().atMost(Duration.ofSeconds(1)) + .untilAsserted(() -> verify(sourceCoordinator).acquireAvailablePartition(DataFilePartition.PARTITION_TYPE)); + Thread.sleep(100); + executorService.shutdownNow(); + + verifyNoInteractions(s3Client, buffer); + } + + @Test + void test_given_available_datafile_partition_then_load_datafile() { + DataFileScheduler objectUnderTest = createObjectUnderTest(); + final String exportTaskId = UUID.randomUUID().toString(); + when(dataFilePartition.getExportTaskId()).thenReturn(exportTaskId); + + when(sourceCoordinator.acquireAvailablePartition(DataFilePartition.PARTITION_TYPE)).thenReturn(Optional.of(dataFilePartition)); + final GlobalState globalStatePartition = mock(GlobalState.class); + final int totalFiles = random.nextInt() + 1; + final Map<String, Object> loadStatusMap = new LoadStatus(totalFiles, totalFiles - 1).toMap(); + when(globalStatePartition.getProgressState()).thenReturn(Optional.of(loadStatusMap)); + when(sourceCoordinator.getPartition(exportTaskId)).thenReturn(Optional.of(globalStatePartition)); + + final ExecutorService executorService = Executors.newSingleThreadExecutor(); + executorService.submit(() -> { + // MockedStatic needs to be created on the same thread where it is used + try (MockedStatic<DataFileLoader> dataFileLoaderMockedStatic = mockStatic(DataFileLoader.class)) { + DataFileLoader dataFileLoader = mock(DataFileLoader.class); + dataFileLoaderMockedStatic.when(() -> DataFileLoader.create( + eq(dataFilePartition), any(InputCodec.class), any(BufferAccumulator.class), any(S3ObjectReader.class), any(ExportRecordConverter.class))) + .thenReturn(dataFileLoader); + doNothing().when(dataFileLoader).run(); + objectUnderTest.run(); + } + }); + await().atMost(Duration.ofSeconds(1)) + .untilAsserted(() -> verify(sourceCoordinator).completePartition(dataFilePartition)); + executorService.shutdownNow(); + + verify(sourceCoordinator).completePartition(dataFilePartition); + } + + @Test + void test_shutdown() { + DataFileScheduler objectUnderTest = createObjectUnderTest(); + final ExecutorService executorService = Executors.newSingleThreadExecutor(); + executorService.submit(objectUnderTest); + 
+ objectUnderTest.shutdown(); + + verifyNoMoreInteractions(sourceCoordinator); + executorService.shutdownNow(); + } + + private DataFileScheduler createObjectUnderTest() { + return new DataFileScheduler(sourceCoordinator, sourceConfig, s3Client, eventFactory, buffer); + } +} \ No newline at end of file diff --git a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/ExportSchedulerTest.java b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/ExportSchedulerTest.java new file mode 100644 index 0000000000..32aff02a57 --- /dev/null +++ b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/ExportSchedulerTest.java @@ -0,0 +1,207 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.rds.export; + + +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.Answers; +import org.mockito.Mock; +import org.mockito.Mockito; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.dataprepper.metrics.PluginMetrics; +import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourceCoordinator; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.DataFilePartition; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.ExportPartition; +import org.opensearch.dataprepper.plugins.source.rds.coordination.state.ExportProgressState; +import software.amazon.awssdk.services.rds.RdsClient; +import software.amazon.awssdk.services.rds.model.CreateDbSnapshotRequest; +import software.amazon.awssdk.services.rds.model.CreateDbSnapshotResponse; +import software.amazon.awssdk.services.rds.model.DBSnapshot; +import software.amazon.awssdk.services.rds.model.DescribeDbSnapshotsRequest; +import software.amazon.awssdk.services.rds.model.DescribeDbSnapshotsResponse; +import software.amazon.awssdk.services.rds.model.DescribeExportTasksRequest; +import software.amazon.awssdk.services.rds.model.DescribeExportTasksResponse; +import software.amazon.awssdk.services.rds.model.StartExportTaskRequest; +import software.amazon.awssdk.services.rds.model.StartExportTaskResponse; +import software.amazon.awssdk.services.s3.S3Client; +import software.amazon.awssdk.services.s3.model.ListObjectsV2Request; +import software.amazon.awssdk.services.s3.model.ListObjectsV2Response; +import software.amazon.awssdk.services.s3.model.S3Object; + +import java.time.Duration; +import java.time.Instant; +import java.util.List; +import java.util.Optional; +import java.util.UUID; +import java.util.concurrent.ExecutorService; +import java.util.concurrent.Executors; + +import static org.awaitility.Awaitility.await; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.lenient; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.never; +import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.verifyNoInteractions; +import static org.mockito.Mockito.verifyNoMoreInteractions; +import static org.mockito.Mockito.when; +import static org.opensearch.dataprepper.plugins.source.rds.export.ExportScheduler.PARQUET_SUFFIX; + + +@ExtendWith(MockitoExtension.class) +class ExportSchedulerTest { + + @Mock + private EnhancedSourceCoordinator sourceCoordinator; + + @Mock + private RdsClient rdsClient; + + 
@Mock + private S3Client s3Client; + + @Mock + private PluginMetrics pluginMetrics; + + @Mock + private ExportPartition exportPartition; + + @Mock(answer = Answers.RETURNS_DEFAULTS) + private ExportProgressState exportProgressState; + + private ExportScheduler exportScheduler; + + @BeforeEach + void setUp() { + exportScheduler = createObjectUnderTest(); + } + + @Test + void test_given_no_export_partition_then_not_export() throws InterruptedException { + when(sourceCoordinator.acquireAvailablePartition(ExportPartition.PARTITION_TYPE)).thenReturn(Optional.empty()); + + final ExecutorService executorService = Executors.newSingleThreadExecutor(); + executorService.submit(exportScheduler); + await().atMost(Duration.ofSeconds(1)) + .untilAsserted(() -> verify(sourceCoordinator).acquireAvailablePartition(ExportPartition.PARTITION_TYPE)); + Thread.sleep(100); + executorService.shutdownNow(); + + verifyNoInteractions(rdsClient); + } + + @Test + void test_given_export_partition_and_task_id_then_complete_export() throws InterruptedException { + when(sourceCoordinator.acquireAvailablePartition(ExportPartition.PARTITION_TYPE)).thenReturn(Optional.of(exportPartition)); + when(exportPartition.getPartitionKey()).thenReturn(UUID.randomUUID().toString()); + when(exportProgressState.getExportTaskId()).thenReturn(UUID.randomUUID().toString()); + when(exportPartition.getProgressState()).thenReturn(Optional.of(exportProgressState)); + + DescribeExportTasksResponse describeExportTasksResponse = mock(DescribeExportTasksResponse.class, Mockito.RETURNS_DEEP_STUBS); + when(describeExportTasksResponse.exportTasks().get(0).status()).thenReturn("COMPLETE"); + when(rdsClient.describeExportTasks(any(DescribeExportTasksRequest.class))).thenReturn(describeExportTasksResponse); + + // Mock list s3 objects response + ListObjectsV2Response listObjectsV2Response = mock(ListObjectsV2Response.class); + String exportTaskId = UUID.randomUUID().toString(); + String tableName = UUID.randomUUID().toString(); + // objectKey needs to have this structure: "{prefix}/{export task ID}/{database name}/{table name}/{numbered folder}/{file name}" + S3Object s3Object = S3Object.builder() + .key("prefix/" + exportTaskId + "/my_db/" + tableName + "/1/file1" + PARQUET_SUFFIX) + .build(); + when(listObjectsV2Response.contents()).thenReturn(List.of(s3Object)); + when(listObjectsV2Response.isTruncated()).thenReturn(false); + when(s3Client.listObjectsV2(any(ListObjectsV2Request.class))).thenReturn(listObjectsV2Response); + + final ExecutorService executorService = Executors.newSingleThreadExecutor(); + executorService.submit(exportScheduler); + await().atMost(Duration.ofSeconds(1)) + .untilAsserted(() -> verify(sourceCoordinator).acquireAvailablePartition(ExportPartition.PARTITION_TYPE)); + Thread.sleep(100); + executorService.shutdownNow(); + + verify(sourceCoordinator).createPartition(any(DataFilePartition.class)); + verify(sourceCoordinator).completePartition(exportPartition); + verify(rdsClient, never()).startExportTask(any(StartExportTaskRequest.class)); + verify(rdsClient, never()).createDBSnapshot(any(CreateDbSnapshotRequest.class)); + } + + + @Test + void test_given_export_partition_without_task_id_then_start_and_complete_export() throws InterruptedException { + when(sourceCoordinator.acquireAvailablePartition(ExportPartition.PARTITION_TYPE)).thenReturn(Optional.of(exportPartition)); + when(exportPartition.getPartitionKey()).thenReturn(UUID.randomUUID().toString()); + 
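/* The two-stage stub below simulates an export that has no task ID on the first poll and acquires one once the export task has been started. */ +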
when(exportProgressState.getExportTaskId()).thenReturn(null).thenReturn(UUID.randomUUID().toString()); + when(exportPartition.getProgressState()).thenReturn(Optional.of(exportProgressState)); + final String dbIdentifier = UUID.randomUUID().toString(); + when(exportPartition.getDbIdentifier()).thenReturn(dbIdentifier); + + // Mock snapshot response + CreateDbSnapshotResponse createDbSnapshotResponse = mock(CreateDbSnapshotResponse.class); + DBSnapshot dbSnapshot = mock(DBSnapshot.class); + final String snapshotArn = "arn:aws:rds:us-east-1:123456789012:snapshot:snapshot-0b5ae174"; + when(dbSnapshot.dbSnapshotArn()).thenReturn(snapshotArn); + when(dbSnapshot.status()).thenReturn("creating").thenReturn("available"); + when(dbSnapshot.snapshotCreateTime()).thenReturn(Instant.now()); + when(createDbSnapshotResponse.dbSnapshot()).thenReturn(dbSnapshot); + when(rdsClient.createDBSnapshot(any(CreateDbSnapshotRequest.class))).thenReturn(createDbSnapshotResponse); + + DescribeDbSnapshotsResponse describeDbSnapshotsResponse = DescribeDbSnapshotsResponse.builder() + .dbSnapshots(dbSnapshot) + .build(); + when(rdsClient.describeDBSnapshots(any(DescribeDbSnapshotsRequest.class))).thenReturn(describeDbSnapshotsResponse); + + // Mock export response + StartExportTaskResponse startExportTaskResponse = mock(StartExportTaskResponse.class); + when(startExportTaskResponse.status()).thenReturn("STARTING"); + when(rdsClient.startExportTask(any(StartExportTaskRequest.class))).thenReturn(startExportTaskResponse); + + DescribeExportTasksResponse describeExportTasksResponse = mock(DescribeExportTasksResponse.class, Mockito.RETURNS_DEEP_STUBS); + when(describeExportTasksResponse.exportTasks().get(0).status()).thenReturn("COMPLETE"); + when(rdsClient.describeExportTasks(any(DescribeExportTasksRequest.class))).thenReturn(describeExportTasksResponse); + + // Mock list s3 objects response + ListObjectsV2Response listObjectsV2Response = mock(ListObjectsV2Response.class); + String exportTaskId = UUID.randomUUID().toString(); + String tableName = UUID.randomUUID().toString(); + // objectKey needs to have this structure: "{prefix}/{export task ID}/{database name}/{table name}/{numbered folder}/{file name}" + S3Object s3Object = S3Object.builder() + .key("prefix/" + exportTaskId + "/my_db/" + tableName + "/1/file1" + PARQUET_SUFFIX) + .build(); + when(listObjectsV2Response.contents()).thenReturn(List.of(s3Object)); + when(listObjectsV2Response.isTruncated()).thenReturn(false); + when(s3Client.listObjectsV2(any(ListObjectsV2Request.class))).thenReturn(listObjectsV2Response); + + final ExecutorService executorService = Executors.newSingleThreadExecutor(); + executorService.submit(exportScheduler); + await().atMost(Duration.ofSeconds(1)) + .untilAsserted(() -> verify(sourceCoordinator).acquireAvailablePartition(ExportPartition.PARTITION_TYPE)); + Thread.sleep(200); + executorService.shutdownNow(); + + verify(rdsClient).createDBSnapshot(any(CreateDbSnapshotRequest.class)); + verify(rdsClient).startExportTask(any(StartExportTaskRequest.class)); + verify(sourceCoordinator).createPartition(any(DataFilePartition.class)); + verify(sourceCoordinator).completePartition(exportPartition); + } + + @Test + void test_shutDown() { + lenient().when(sourceCoordinator.acquireAvailablePartition(ExportPartition.PARTITION_TYPE)).thenReturn(Optional.empty()); + + final ExecutorService executorService = Executors.newSingleThreadExecutor(); + executorService.submit(exportScheduler); + exportScheduler.shutdown(); + 
verifyNoMoreInteractions(sourceCoordinator, rdsClient); + executorService.shutdownNow(); + } + + private ExportScheduler createObjectUnderTest() { + return new ExportScheduler(sourceCoordinator, rdsClient, s3Client, pluginMetrics); + } +} \ No newline at end of file diff --git a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/ExportTaskManagerTest.java b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/ExportTaskManagerTest.java new file mode 100644 index 0000000000..15a23277c7 --- /dev/null +++ b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/ExportTaskManagerTest.java @@ -0,0 +1,104 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.rds.export; + +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.Arguments; +import org.junit.jupiter.params.provider.MethodSource; +import org.mockito.ArgumentCaptor; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; +import software.amazon.awssdk.services.rds.RdsClient; +import software.amazon.awssdk.services.rds.model.DescribeExportTasksRequest; +import software.amazon.awssdk.services.rds.model.DescribeExportTasksResponse; +import software.amazon.awssdk.services.rds.model.ExportTask; +import software.amazon.awssdk.services.rds.model.StartExportTaskRequest; + +import java.util.List; +import java.util.UUID; +import java.util.stream.Stream; + +import static org.hamcrest.MatcherAssert.assertThat; +import static org.hamcrest.Matchers.equalTo; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.when; + + +@ExtendWith(MockitoExtension.class) +class ExportTaskManagerTest { + + @Mock + private RdsClient rdsClient; + + private ExportTaskManager exportTaskManager; + + @BeforeEach + void setUp() { + exportTaskManager = createObjectUnderTest(); + } + + @ParameterizedTest + @MethodSource("provideStartExportTaskTestParameters") + void test_start_export_task(List<String> exportOnly) { + final String snapshotArn = "arn:aws:rds:us-east-1:123456789012:snapshot:" + UUID.randomUUID(); + final String iamRoleArn = "arn:aws:iam:us-east-1:123456789012:role:" + UUID.randomUUID(); + final String bucket = UUID.randomUUID().toString(); + final String prefix = UUID.randomUUID().toString(); + final String kmsKey = UUID.randomUUID().toString(); + + exportTaskManager.startExportTask(snapshotArn, iamRoleArn, bucket, prefix, kmsKey, exportOnly); + + final ArgumentCaptor<StartExportTaskRequest> exportTaskRequestArgumentCaptor = + ArgumentCaptor.forClass(StartExportTaskRequest.class); + + verify(rdsClient).startExportTask(exportTaskRequestArgumentCaptor.capture()); + + final StartExportTaskRequest actualRequest = exportTaskRequestArgumentCaptor.getValue(); + assertThat(actualRequest.sourceArn(), equalTo(snapshotArn)); + assertThat(actualRequest.iamRoleArn(), equalTo(iamRoleArn)); + assertThat(actualRequest.s3BucketName(), equalTo(bucket)); + assertThat(actualRequest.s3Prefix(), equalTo(prefix)); + assertThat(actualRequest.kmsKeyId(), equalTo(kmsKey)); + assertThat(actualRequest.exportOnly(), equalTo(exportOnly)); + } + + @Test + void test_check_export_status() { + final 
String exportTaskId = UUID.randomUUID().toString(); + DescribeExportTasksResponse describeExportTasksResponse = mock(DescribeExportTasksResponse.class); + when(describeExportTasksResponse.exportTasks()).thenReturn(List.of(ExportTask.builder().status("COMPLETE").build())); + when(rdsClient.describeExportTasks(any(DescribeExportTasksRequest.class))).thenReturn(describeExportTasksResponse); + + exportTaskManager.checkExportStatus(exportTaskId); + + final ArgumentCaptor<DescribeExportTasksRequest> exportTaskRequestArgumentCaptor = + ArgumentCaptor.forClass(DescribeExportTasksRequest.class); + + verify(rdsClient).describeExportTasks(exportTaskRequestArgumentCaptor.capture()); + + final DescribeExportTasksRequest actualRequest = exportTaskRequestArgumentCaptor.getValue(); + assertThat(actualRequest.exportTaskIdentifier(), equalTo(exportTaskId)); + } + + private static Stream<Arguments> provideStartExportTaskTestParameters() { + final String tableName1 = UUID.randomUUID().toString(); + final String tableName2 = UUID.randomUUID().toString(); + return Stream.of( + Arguments.of(List.of()), + Arguments.of(List.of(tableName1)), + Arguments.of(List.of(tableName1, tableName2)) + ); + } + + private ExportTaskManager createObjectUnderTest() { + return new ExportTaskManager(rdsClient); + } +} \ No newline at end of file diff --git a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/S3ObjectReaderTest.java b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/S3ObjectReaderTest.java new file mode 100644 index 0000000000..44aa22f6ad --- /dev/null +++ b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/S3ObjectReaderTest.java @@ -0,0 +1,56 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.rds.export; + +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.ArgumentCaptor; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; +import software.amazon.awssdk.services.s3.S3Client; +import software.amazon.awssdk.services.s3.model.GetObjectRequest; + +import java.util.UUID; + +import static org.hamcrest.MatcherAssert.assertThat; +import static org.hamcrest.Matchers.equalTo; +import static org.mockito.Mockito.verify; + +@ExtendWith(MockitoExtension.class) +class S3ObjectReaderTest { + + @Mock + private S3Client s3Client; + + private S3ObjectReader s3ObjectReader; + + + @BeforeEach + void setUp() { + s3ObjectReader = createObjectUnderTest(); + } + + @Test + void test_readFile() { + final String bucketName = UUID.randomUUID().toString(); + final String key = UUID.randomUUID().toString(); + + + s3ObjectReader.readFile(bucketName, key); + + ArgumentCaptor<GetObjectRequest> getObjectRequestArgumentCaptor = ArgumentCaptor.forClass(GetObjectRequest.class); + verify(s3Client).getObject(getObjectRequestArgumentCaptor.capture()); + + GetObjectRequest request = getObjectRequestArgumentCaptor.getValue(); + assertThat(request.bucket(), equalTo(bucketName)); + assertThat(request.key(), equalTo(key)); + } + + private S3ObjectReader createObjectUnderTest() { + return new S3ObjectReader(s3Client); + } +}
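The SnapshotManager tests that follow pin its contract: createSnapshot issues CreateDBSnapshot and maps the response into SnapshotInfo, returning null on failure. A sketch of the implied logic (assumed, not the actual implementation):

    try {
        final CreateDbSnapshotResponse response = rdsClient.createDBSnapshot(request);
        final DBSnapshot snapshot = response.dbSnapshot();
        return new SnapshotInfo(snapshotId, snapshot.dbSnapshotArn(),
                snapshot.snapshotCreateTime(), snapshot.status());
    } catch (final Exception e) {
        return null;  // exercised by test_create_snapshot_throws_exception_then_returns_null
    }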
diff --git a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/SnapshotManagerTest.java b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/SnapshotManagerTest.java new file mode 100644 index 0000000000..bca52a5fdd --- /dev/null +++ b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/export/SnapshotManagerTest.java @@ -0,0 +1,115 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.rds.export; + +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.ArgumentCaptor; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.dataprepper.plugins.source.rds.model.SnapshotInfo; +import software.amazon.awssdk.services.rds.RdsClient; +import software.amazon.awssdk.services.rds.model.CreateDbSnapshotRequest; +import software.amazon.awssdk.services.rds.model.CreateDbSnapshotResponse; +import software.amazon.awssdk.services.rds.model.DBSnapshot; +import software.amazon.awssdk.services.rds.model.DescribeDbSnapshotsRequest; +import software.amazon.awssdk.services.rds.model.DescribeDbSnapshotsResponse; + +import java.time.Instant; +import java.util.List; +import java.util.UUID; + +import static org.hamcrest.MatcherAssert.assertThat; +import static org.hamcrest.Matchers.equalTo; +import static org.hamcrest.Matchers.notNullValue; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.when; + +@ExtendWith(MockitoExtension.class) +class SnapshotManagerTest { + + @Mock + private RdsClient rdsClient; + + private SnapshotManager snapshotManager; + + @BeforeEach + void setUp() { + snapshotManager = createObjectUnderTest(); + } + + @Test + void test_create_snapshot_with_success() { + String dbInstanceId = UUID.randomUUID().toString(); + CreateDbSnapshotResponse createDbSnapshotResponse = mock(CreateDbSnapshotResponse.class); + DBSnapshot dbSnapshot = mock(DBSnapshot.class); + final String snapshotArn = "arn:aws:rds:us-east-1:123456789012:snapshot:snapshot-0b5ae174"; + final String status = "creating"; + final Instant createTime = Instant.now(); + when(dbSnapshot.dbSnapshotArn()).thenReturn(snapshotArn); + when(dbSnapshot.status()).thenReturn(status); + when(dbSnapshot.snapshotCreateTime()).thenReturn(createTime); + when(createDbSnapshotResponse.dbSnapshot()).thenReturn(dbSnapshot); + when(rdsClient.createDBSnapshot(any(CreateDbSnapshotRequest.class))).thenReturn(createDbSnapshotResponse); + + SnapshotInfo snapshotInfo = snapshotManager.createSnapshot(dbInstanceId); + + ArgumentCaptor<CreateDbSnapshotRequest> argumentCaptor = ArgumentCaptor.forClass(CreateDbSnapshotRequest.class); + verify(rdsClient).createDBSnapshot(argumentCaptor.capture()); + + CreateDbSnapshotRequest request = argumentCaptor.getValue(); + assertThat(request.dbInstanceIdentifier(), equalTo(dbInstanceId)); + + assertThat(snapshotInfo, notNullValue()); + assertThat(snapshotInfo.getSnapshotArn(), equalTo(snapshotArn)); + assertThat(snapshotInfo.getStatus(), equalTo(status)); + assertThat(snapshotInfo.getCreateTime(), equalTo(createTime)); + } + + @Test + void test_create_snapshot_throws_exception_then_returns_null() { + String dbInstanceId = UUID.randomUUID().toString(); + when(rdsClient.createDBSnapshot(any(CreateDbSnapshotRequest.class))).thenThrow(new RuntimeException("Error")); + + SnapshotInfo snapshotInfo = 
snapshotManager.createSnapshot(dbInstanceId); + + assertThat(snapshotInfo, equalTo(null)); + } + + @Test + void test_check_snapshot_status_returns_correct_result() { + DBSnapshot dbSnapshot = mock(DBSnapshot.class); + final String snapshotArn = "arn:aws:rds:us-east-1:123456789012:snapshot:snapshot-0b5ae174"; + final String status = "creating"; + final Instant createTime = Instant.now(); + when(dbSnapshot.dbSnapshotArn()).thenReturn(snapshotArn); + when(dbSnapshot.status()).thenReturn(status); + when(dbSnapshot.snapshotCreateTime()).thenReturn(createTime); + DescribeDbSnapshotsResponse describeDbSnapshotsResponse = mock(DescribeDbSnapshotsResponse.class); + when(describeDbSnapshotsResponse.dbSnapshots()).thenReturn(List.of(dbSnapshot)); + + final String snapshotId = UUID.randomUUID().toString(); + DescribeDbSnapshotsRequest describeDbSnapshotsRequest = DescribeDbSnapshotsRequest.builder() + .dbSnapshotIdentifier(snapshotId) + .build(); + when(rdsClient.describeDBSnapshots(describeDbSnapshotsRequest)).thenReturn(describeDbSnapshotsResponse); + + SnapshotInfo snapshotInfo = snapshotManager.checkSnapshotStatus(snapshotId); + + assertThat(snapshotInfo, notNullValue()); + assertThat(snapshotInfo.getSnapshotId(), equalTo(snapshotId)); + assertThat(snapshotInfo.getSnapshotArn(), equalTo(snapshotArn)); + assertThat(snapshotInfo.getStatus(), equalTo(status)); + assertThat(snapshotInfo.getCreateTime(), equalTo(createTime)); + } + + private SnapshotManager createObjectUnderTest() { + return new SnapshotManager(rdsClient); + } +} \ No newline at end of file diff --git a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/leader/LeaderSchedulerTest.java b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/leader/LeaderSchedulerTest.java new file mode 100644 index 0000000000..e844cc0ff4 --- /dev/null +++ b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/leader/LeaderSchedulerTest.java @@ -0,0 +1,135 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.rds.leader; + +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.Answers; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.dataprepper.model.source.coordinator.enhanced.EnhancedSourceCoordinator; +import org.opensearch.dataprepper.plugins.source.rds.RdsSourceConfig; +import org.opensearch.dataprepper.plugins.source.rds.configuration.AwsAuthenticationConfig; +import org.opensearch.dataprepper.plugins.source.rds.configuration.ExportConfig; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.ExportPartition; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.GlobalState; +import org.opensearch.dataprepper.plugins.source.rds.coordination.partition.LeaderPartition; +import org.opensearch.dataprepper.plugins.source.rds.coordination.state.LeaderProgressState; + +import java.time.Duration; +import java.util.Optional; +import java.util.UUID; +import java.util.concurrent.ExecutorService; +import java.util.concurrent.Executors; + +import static org.awaitility.Awaitility.await; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.ArgumentMatchers.eq; +import static org.mockito.Mockito.lenient; +import static org.mockito.Mockito.mock; 
+import static org.mockito.Mockito.never; +import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.verifyNoMoreInteractions; +import static org.mockito.Mockito.when; + + +@ExtendWith(MockitoExtension.class) +class LeaderSchedulerTest { + + @Mock + private EnhancedSourceCoordinator sourceCoordinator; + + @Mock(answer = Answers.RETURNS_DEFAULTS) + private RdsSourceConfig sourceConfig; + + @Mock + private LeaderPartition leaderPartition; + + @Mock + private LeaderProgressState leaderProgressState; + + private LeaderScheduler leaderScheduler; + + @BeforeEach + void setUp() { + leaderScheduler = createObjectUnderTest(); + + AwsAuthenticationConfig awsAuthenticationConfig = mock(AwsAuthenticationConfig.class); + lenient().when(awsAuthenticationConfig.getAwsStsRoleArn()).thenReturn(UUID.randomUUID().toString()); + lenient().when(sourceConfig.getAwsAuthenticationConfig()).thenReturn(awsAuthenticationConfig); + ExportConfig exportConfig = mock(ExportConfig.class); + lenient().when(exportConfig.getKmsKeyId()).thenReturn(UUID.randomUUID().toString()); + lenient().when(sourceConfig.getExport()).thenReturn(exportConfig); + } + + @Test + void non_leader_node_should_not_perform_init() throws InterruptedException { + when(sourceCoordinator.acquireAvailablePartition(LeaderPartition.PARTITION_TYPE)).thenReturn(Optional.empty()); + + final ExecutorService executorService = Executors.newSingleThreadExecutor(); + executorService.submit(leaderScheduler); + await().atMost(Duration.ofSeconds(1)) + .untilAsserted(() -> verify(sourceCoordinator).acquireAvailablePartition(LeaderPartition.PARTITION_TYPE)); + Thread.sleep(100); + executorService.shutdownNow(); + + verify(sourceCoordinator, never()).createPartition(any(GlobalState.class)); + verify(sourceCoordinator, never()).createPartition(any(ExportPartition.class)); + } + + @Test + void leader_node_should_perform_init_if_not_initialized() throws InterruptedException { + when(sourceCoordinator.acquireAvailablePartition(LeaderPartition.PARTITION_TYPE)).thenReturn(Optional.of(leaderPartition)); + when(leaderPartition.getProgressState()).thenReturn(Optional.of(leaderProgressState)); + when(leaderProgressState.isInitialized()).thenReturn(false); + when(sourceConfig.isExportEnabled()).thenReturn(true); + + final ExecutorService executorService = Executors.newSingleThreadExecutor(); + executorService.submit(leaderScheduler); + await().atMost(Duration.ofSeconds(1)) + .untilAsserted(() -> verify(sourceCoordinator).acquireAvailablePartition(LeaderPartition.PARTITION_TYPE)); + Thread.sleep(100); + executorService.shutdownNow(); + + verify(sourceCoordinator).createPartition(any(GlobalState.class)); + verify(sourceCoordinator).createPartition(any(ExportPartition.class)); + verify(sourceCoordinator).saveProgressStateForPartition(eq(leaderPartition), any(Duration.class)); + } + + @Test + void leader_node_should_skip_init_if_initialized() throws InterruptedException { + when(sourceCoordinator.acquireAvailablePartition(LeaderPartition.PARTITION_TYPE)).thenReturn(Optional.of(leaderPartition)); + when(leaderPartition.getProgressState()).thenReturn(Optional.of(leaderProgressState)); + when(leaderProgressState.isInitialized()).thenReturn(true); + + final ExecutorService executorService = Executors.newSingleThreadExecutor(); + executorService.submit(leaderScheduler); + await().atMost(Duration.ofSeconds(1)) + .untilAsserted(() -> verify(sourceCoordinator).acquireAvailablePartition(LeaderPartition.PARTITION_TYPE)); + Thread.sleep(100); + executorService.shutdownNow(); 
+ + verify(sourceCoordinator, never()).createPartition(any(GlobalState.class)); + verify(sourceCoordinator, never()).createPartition(any(ExportPartition.class)); + verify(sourceCoordinator).saveProgressStateForPartition(eq(leaderPartition), any(Duration.class)); + } + + @Test + void test_shutDown() { + lenient().when(sourceCoordinator.acquireAvailablePartition(LeaderPartition.PARTITION_TYPE)).thenReturn(Optional.empty()); + + final ExecutorService executorService = Executors.newSingleThreadExecutor(); + executorService.submit(leaderScheduler); + leaderScheduler.shutdown(); + verifyNoMoreInteractions(sourceCoordinator); + executorService.shutdownNow(); + } + + private LeaderScheduler createObjectUnderTest() { + return new LeaderScheduler(sourceCoordinator, sourceConfig); + } +} \ No newline at end of file diff --git a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/model/ExportObjectKeyTest.java b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/model/ExportObjectKeyTest.java new file mode 100644 index 0000000000..7056114572 --- /dev/null +++ b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/model/ExportObjectKeyTest.java @@ -0,0 +1,37 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.rds.model; + +import org.junit.jupiter.api.Test; + +import static org.hamcrest.MatcherAssert.assertThat; +import static org.hamcrest.Matchers.containsString; +import static org.hamcrest.Matchers.equalTo; +import static org.junit.jupiter.api.Assertions.assertThrows; + +class ExportObjectKeyTest { + + @Test + void test_fromString_with_valid_input_string() { + final String objectKeyString = "prefix/export-task-id/db-name/table-name/1/file-name.parquet"; + final ExportObjectKey exportObjectKey = ExportObjectKey.fromString(objectKeyString); + + assertThat(exportObjectKey.getPrefix(), equalTo("prefix")); + assertThat(exportObjectKey.getExportTaskId(), equalTo("export-task-id")); + assertThat(exportObjectKey.getDatabaseName(), equalTo("db-name")); + assertThat(exportObjectKey.getTableName(), equalTo("table-name")); + assertThat(exportObjectKey.getNumberedFolder(), equalTo("1")); + assertThat(exportObjectKey.getFileName(), equalTo("file-name.parquet")); + } + + @Test + void test_fromString_with_invalid_input_string() { + final String objectKeyString = "prefix/export-task-id/db-name/table-name/1/"; + + Throwable exception = assertThrows(IllegalArgumentException.class, () -> ExportObjectKey.fromString(objectKeyString)); + assertThat(exception.getMessage(), containsString("Export object key is not valid: " + objectKeyString)); + } +} \ No newline at end of file diff --git a/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/model/ExportStatusTest.java b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/model/ExportStatusTest.java new file mode 100644 index 0000000000..16ef0c0a1b --- /dev/null +++ b/data-prepper-plugins/rds-source/src/test/java/org/opensearch/dataprepper/plugins/source/rds/model/ExportStatusTest.java @@ -0,0 +1,49 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.rds.model; + +import org.junit.jupiter.api.extension.ExtensionContext; +import org.junit.jupiter.params.ParameterizedTest; +import 
org.junit.jupiter.params.provider.Arguments; +import org.junit.jupiter.params.provider.ArgumentsProvider; +import org.junit.jupiter.params.provider.ArgumentsSource; +import org.junit.jupiter.params.provider.EnumSource; + +import java.util.stream.Stream; + +import static org.hamcrest.CoreMatchers.equalTo; +import static org.hamcrest.MatcherAssert.assertThat; + +class ExportStatusTest { + + @ParameterizedTest + @EnumSource(ExportStatus.class) + void fromString_returns_expected_value(final ExportStatus status) { + assertThat(ExportStatus.fromString(status.name()), equalTo(status)); + } + + @ParameterizedTest + @ArgumentsSource(ProvideTerminalStatusTestData.class) + void test_is_terminal_returns_expected_result(final String status, final boolean expected_result) { + assertThat(ExportStatus.isTerminal(status), equalTo(expected_result)); + } + + static class ProvideTerminalStatusTestData implements ArgumentsProvider { + @Override + public Stream<? extends Arguments> provideArguments(ExtensionContext context) { + return Stream.of( + Arguments.of("COMPLETE", true), + Arguments.of("CANCELED", true), + Arguments.of("FAILED", true), + Arguments.of("CANCELING", false), + Arguments.of("IN_PROGRESS", false), + Arguments.of("STARTING", false), + Arguments.of("INVALID_STATUS", false), + Arguments.of(null, false) + ); + } + } +} diff --git a/data-prepper-plugins/rss-source/build.gradle b/data-prepper-plugins/rss-source/build.gradle index 68c0ff9e57..686e40367b 100644 --- a/data-prepper-plugins/rss-source/build.gradle +++ b/data-prepper-plugins/rss-source/build.gradle @@ -13,7 +13,7 @@ dependencies { implementation 'joda-time:joda-time:2.12.7' implementation 'com.fasterxml.jackson.core:jackson-core' implementation 'com.fasterxml.jackson.core:jackson-databind' - implementation 'com.apptasticsoftware:rssreader:3.6.0' + implementation 'com.apptasticsoftware:rssreader:3.7.0' testImplementation libs.commons.lang3 testImplementation project(':data-prepper-test-common') testImplementation 'org.mock-server:mockserver-junit-jupiter-no-dependencies:5.15.0' diff --git a/data-prepper-plugins/s3-sink/build.gradle b/data-prepper-plugins/s3-sink/build.gradle index 5a6b174900..4ea0a364fd 100644 --- a/data-prepper-plugins/s3-sink/build.gradle +++ b/data-prepper-plugins/s3-sink/build.gradle @@ -19,8 +19,12 @@ dependencies { implementation 'org.jetbrains.kotlin:kotlin-stdlib:1.9.22' implementation project(':data-prepper-plugins:avro-codecs') implementation libs.avro.core - implementation libs.hadoop.common - implementation 'org.apache.parquet:parquet-avro:1.14.0' + implementation(libs.hadoop.common) { + exclude group: 'org.eclipse.jetty' + exclude group: 'org.apache.hadoop', module: 'hadoop-auth' + exclude group: 'org.apache.zookeeper', module: 'zookeeper' + } + implementation libs.parquet.avro implementation 'software.amazon.awssdk:apache-client' implementation 'org.jetbrains.kotlin:kotlin-stdlib-common:1.9.22' implementation libs.commons.lang3
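The s3-sink test changes below move Parquet file IO off Hadoop's Configuration/Path onto parquet-java's local-file classes (PlainParquetConfiguration, LocalInputFile), which lines up with the hadoop-common excludes above. The resulting read pattern is roughly:

    // java.nio.file.Path replaces org.apache.hadoop.fs.Path; no Hadoop Configuration needed
    try (ParquetFileReader reader = new ParquetFileReader(
            new LocalInputFile(Path.of(tempFile.toURI())),
            ParquetReadOptions.builder().build())) {
        ParquetMetadata footer = reader.getFooter();
    }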
diff --git a/data-prepper-plugins/s3-sink/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/s3/ParquetOutputScenario.java b/data-prepper-plugins/s3-sink/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/s3/ParquetOutputScenario.java index 6e3abc3250..e01c61fe09 100644 --- a/data-prepper-plugins/s3-sink/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/s3/ParquetOutputScenario.java +++ b/data-prepper-plugins/s3-sink/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/s3/ParquetOutputScenario.java @@ -9,6 +9,7 @@ import org.apache.avro.util.Utf8; import org.apache.parquet.ParquetReadOptions; import org.apache.parquet.avro.AvroParquetReader; +import org.apache.parquet.conf.PlainParquetConfiguration; import org.apache.parquet.hadoop.ParquetFileReader; import org.apache.parquet.hadoop.ParquetReader; import org.apache.parquet.hadoop.metadata.BlockMetaData; @@ -65,7 +66,7 @@ public void validate(int expectedRecords, final List<Map<String, Object>> sample int validatedRecords = 0; int count = 0; - try (final ParquetReader<GenericRecord> reader = AvroParquetReader.<GenericRecord>builder(inputFile) + try (final ParquetReader<GenericRecord> reader = AvroParquetReader.<GenericRecord>builder(inputFile, new PlainParquetConfiguration()) .build()) { GenericRecord record; diff --git a/data-prepper-plugins/s3-sink/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/s3/S3SinkServiceIT.java b/data-prepper-plugins/s3-sink/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/s3/S3SinkServiceIT.java index 739ac876df..b7bbb1b97d 100644 --- a/data-prepper-plugins/s3-sink/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/s3/S3SinkServiceIT.java +++ b/data-prepper-plugins/s3-sink/src/integrationTest/java/org/opensearch/dataprepper/plugins/sink/s3/S3SinkServiceIT.java @@ -12,8 +12,6 @@ import org.apache.avro.Schema; import org.apache.avro.SchemaBuilder; import org.apache.commons.io.IOUtils; -import org.apache.hadoop.conf.Configuration; -import org.apache.hadoop.fs.Path; import org.apache.parquet.ParquetReadOptions; import org.apache.parquet.column.page.PageReadStore; import org.apache.parquet.example.data.Group; @@ -21,8 +19,8 @@ import org.apache.parquet.example.data.simple.convert.GroupRecordConverter; import org.apache.parquet.hadoop.ParquetFileReader; import org.apache.parquet.hadoop.metadata.ParquetMetadata; -import org.apache.parquet.hadoop.util.HadoopInputFile; import org.apache.parquet.io.ColumnIOFactory; +import org.apache.parquet.io.LocalInputFile; import org.apache.parquet.io.MessageColumnIO; import org.apache.parquet.io.RecordReader; import org.apache.parquet.schema.MessageType; @@ -79,6 +77,7 @@ import java.io.InputStream; import java.nio.charset.Charset; import java.nio.file.Files; +import java.nio.file.Path; import java.nio.file.StandardCopyOption; import java.time.Duration; import java.util.ArrayList; @@ -413,7 +412,7 @@ private List<Map<String, Object>> createParquetRecordsList(final InputStream final File tempFile = File.createTempFile(FILE_NAME, FILE_SUFFIX); Files.copy(inputStream, tempFile.toPath(), StandardCopyOption.REPLACE_EXISTING); List<Map<String, Object>> actualRecordList = new ArrayList<>(); - try (ParquetFileReader parquetFileReader = new ParquetFileReader(HadoopInputFile.fromPath(new Path(tempFile.toURI())), new Configuration()), ParquetReadOptions.builder().build())) { + try (final ParquetFileReader parquetFileReader = new ParquetFileReader(new LocalInputFile(Path.of(tempFile.toURI())), ParquetReadOptions.builder().build())) { final ParquetMetadata footer = parquetFileReader.getFooter(); final MessageType schema = createdParquetSchema(footer); PageReadStore pages; diff --git a/data-prepper-plugins/s3-sink/src/test/java/org/opensearch/dataprepper/plugins/codec/parquet/ParquetOutputCodecTest.java b/data-prepper-plugins/s3-sink/src/test/java/org/opensearch/dataprepper/plugins/codec/parquet/ParquetOutputCodecTest.java index b441a7a6e3..059b908aa4 100644 --- a/data-prepper-plugins/s3-sink/src/test/java/org/opensearch/dataprepper/plugins/codec/parquet/ParquetOutputCodecTest.java +++ b/data-prepper-plugins/s3-sink/src/test/java/org/opensearch/dataprepper/plugins/codec/parquet/ParquetOutputCodecTest.java @@ -11,6 
+11,7 @@ import org.apache.parquet.ParquetReadOptions; import org.apache.parquet.column.page.PageReadStore; import org.apache.parquet.example.data.Group; +import org.apache.parquet.schema.GroupType; import org.apache.parquet.example.data.simple.SimpleGroup; import org.apache.parquet.example.data.simple.convert.GroupRecordConverter; import org.apache.parquet.hadoop.ParquetFileReader; @@ -41,10 +42,10 @@ import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import java.io.ByteArrayInputStream; -import java.io.File; -import java.io.IOException; +import java.io.FileInputStream; import java.io.InputStream; +import java.io.IOException; +import java.io.File; import java.nio.file.Files; import java.nio.file.StandardCopyOption; import java.util.ArrayList; @@ -60,6 +61,7 @@ import static org.hamcrest.CoreMatchers.containsString; import static org.hamcrest.CoreMatchers.equalTo; +import static org.hamcrest.CoreMatchers.not; import static org.hamcrest.CoreMatchers.notNullValue; import static org.hamcrest.MatcherAssert.assertThat; import static org.hamcrest.Matchers.greaterThanOrEqualTo; @@ -115,11 +117,12 @@ void test_happy_case(final int numberOfRecords) throws Exception { parquetOutputCodec.writeEvent(event, outputStream); } parquetOutputCodec.complete(outputStream); - List<Map<String, Object>> actualRecords = createParquetRecordsList(new ByteArrayInputStream(tempFile.toString().getBytes())); + List<Map<String, Object>> actualRecords = createParquetRecordsList(new FileInputStream(tempFile)); int index = 0; + assertThat(inputMaps.size(), equalTo(actualRecords.size())); for (final Map<String, Object> actualMap : actualRecords) { assertThat(actualMap, notNullValue()); - Map<String, Object> expectedMap = generateRecords(numberOfRecords).get(index); + Map<String, Object> expectedMap = inputMaps.get(index); assertThat(expectedMap, Matchers.equalTo(actualMap)); index++; } @@ -142,14 +145,16 @@ void test_happy_case_nullable_records(final int numberOfRecords) throws Exceptio parquetOutputCodec.writeEvent(event, outputStream); } parquetOutputCodec.complete(outputStream); - List<Map<String, Object>> actualRecords = createParquetRecordsList(new ByteArrayInputStream(tempFile.toString().getBytes())); + List<Map<String, Object>> actualRecords = createParquetRecordsList(new FileInputStream(tempFile)); int index = 0; + assertThat(inputMaps.size(), equalTo(actualRecords.size())); for (final Map<String, Object> actualMap : actualRecords) { assertThat(actualMap, notNullValue()); - Map<String, Object> expectedMap = generateRecords(numberOfRecords).get(index); + Map<String, Object> expectedMap = inputMaps.get(index); assertThat(expectedMap, Matchers.equalTo(actualMap)); index++; } + outputStream.close(); tempFile.delete(); } @@ -168,11 +173,12 @@ void test_happy_case_nullable_records_with_empty_maps(final int numberOfRecords) parquetOutputCodec.writeEvent(event, outputStream); } parquetOutputCodec.complete(outputStream); - List<Map<String, Object>> actualRecords = createParquetRecordsList(new ByteArrayInputStream(tempFile.toString().getBytes())); + List<Map<String, Object>> actualRecords = createParquetRecordsList(new FileInputStream(tempFile)); int index = 0; + assertThat(inputMaps.size(), equalTo(actualRecords.size())); for (final Map<String, Object> actualMap : actualRecords) { assertThat(actualMap, notNullValue()); - Map<String, Object> expectedMap = generateRecords(numberOfRecords).get(index); + Map<String, Object> expectedMap = inputMaps.get(index); assertThat(expectedMap, Matchers.equalTo(actualMap)); index++; } @@ -194,6 +200,9 @@ void writeEvent_includes_record_when_field_does_not_exist_in_user_supplied_schem final Event eventWithInvalidField = mock(Event.class); final String invalidFieldName = UUID.randomUUID().toString(); Map<String, Object> mapWithInvalid = 
generateRecords(1).get(0); + Map mapWithoutInvalid = mapWithInvalid.entrySet() + .stream() + .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue)); mapWithInvalid.put(invalidFieldName, UUID.randomUUID().toString()); when(eventWithInvalidField.toMap()).thenReturn(mapWithInvalid); final ParquetOutputCodec objectUnderTest = createObjectUnderTest(); @@ -205,12 +214,12 @@ void writeEvent_includes_record_when_field_does_not_exist_in_user_supplied_schem objectUnderTest.writeEvent(eventWithInvalidField, outputStream); objectUnderTest.complete(outputStream); - List> actualRecords = createParquetRecordsList(new ByteArrayInputStream(tempFile.toString().getBytes())); + List> actualRecords = createParquetRecordsList(new FileInputStream(tempFile)); int index = 0; for (final Map actualMap : actualRecords) { assertThat(actualMap, notNullValue()); - Map expectedMap = generateRecords(1).get(index); - assertThat(expectedMap, Matchers.equalTo(actualMap)); + assertThat(mapWithInvalid, not(Matchers.equalTo(actualMap))); + assertThat(mapWithoutInvalid, Matchers.equalTo(actualMap)); index++; } } @@ -551,7 +560,29 @@ private static Schema createStandardInnerSchemaForNestedRecord( return assembler.endRecord(); } - private List> createParquetRecordsList(final InputStream inputStream) throws IOException { + private List extractStringList(SimpleGroup group, String fieldName) { + int fieldIndex = group.getType().getFieldIndex(fieldName); + int repetitionCount = group.getGroup(fieldIndex, 0).getFieldRepetitionCount(0); + List resultList = new ArrayList<>(); + for (int i = 0; i < repetitionCount; i++) { + resultList.add(group.getGroup(fieldIndex, 0).getString(0, i)); + } + return resultList; + } + + private Map extractNestedGroup(SimpleGroup group, String fieldName) { + + Map resultMap = new HashMap<>(); + int fieldIndex = group.getType().getFieldIndex(fieldName); + int f1 = group.getGroup(fieldIndex, 0).getType().getFieldIndex("firstFieldInNestedRecord"); + resultMap.put("firstFieldInNestedRecord", group.getGroup(fieldIndex, 0).getString(f1,0)); + int f2 = group.getGroup(fieldIndex, 0).getType().getFieldIndex("secondFieldInNestedRecord"); + resultMap.put("secondFieldInNestedRecord", group.getGroup(fieldIndex, 0).getInteger(f2,0)); + + return resultMap; + } + + private List> createParquetRecordsList(final InputStream inputStream) throws IOException, RuntimeException { final File tempFile = new File(tempDirectory, FILE_NAME); Files.copy(inputStream, tempFile.toPath(), StandardCopyOption.REPLACE_EXISTING); @@ -567,15 +598,34 @@ private List> createParquetRecordsList(final InputStream inp final RecordReader recordReader = columnIO.getRecordReader(pages, new GroupRecordConverter(schema)); for (int row = 0; row < rows; row++) { final Map eventData = new HashMap<>(); - int fieldIndex = 0; final SimpleGroup simpleGroup = (SimpleGroup) recordReader.read(); + final GroupType groupType = simpleGroup.getType(); + + for (Type field : schema.getFields()) { - try { - eventData.put(field.getName(), simpleGroup.getValueToString(fieldIndex, 0)); - } catch (Exception parquetException) { - LOG.error("Failed to parse Parquet", parquetException); + Object value; + int fieldIndex = groupType.getFieldIndex(field.getName()); + if (simpleGroup.getFieldRepetitionCount(fieldIndex) == 0) { + continue; + } + switch (field.getName()) { + case "name": value = simpleGroup.getString(fieldIndex, 0); + break; + case "age": value = simpleGroup.getInteger(fieldIndex, 0); + break; + case "myLong": value = simpleGroup.getLong(fieldIndex, 0); + 
break; + case "myFloat": value = simpleGroup.getFloat(fieldIndex, 0); + break; + case "myDouble": value = simpleGroup.getDouble(fieldIndex, 0); + break; + case "myArray": value = extractStringList(simpleGroup, "myArray"); + break; + case "nestedRecord": value = extractNestedGroup(simpleGroup, "nestedRecord"); + break; + default: throw new IllegalArgumentException("Unknown field"); } - fieldIndex++; + eventData.put(field.getName(), value); } actualRecordList.add((HashMap) eventData); } @@ -591,4 +641,4 @@ private List> createParquetRecordsList(final InputStream inp private MessageType createdParquetSchema(ParquetMetadata parquetMetadata) { return parquetMetadata.getFileMetaData().getSchema(); } -} \ No newline at end of file +} diff --git a/data-prepper-plugins/s3-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/s3/S3SinkServiceTest.java b/data-prepper-plugins/s3-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/s3/S3SinkServiceTest.java index 88c4df5202..c1f84d2bd4 100644 --- a/data-prepper-plugins/s3-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/s3/S3SinkServiceTest.java +++ b/data-prepper-plugins/s3-sink/src/test/java/org/opensearch/dataprepper/plugins/sink/s3/S3SinkServiceTest.java @@ -565,8 +565,8 @@ void output_will_skip_and_drop_failed_records() throws IOException { DefaultEventHandle eventHandle1 = (DefaultEventHandle)event1.getEventHandle(); DefaultEventHandle eventHandle2 = (DefaultEventHandle)event2.getEventHandle(); - eventHandle1.setAcknowledgementSet(acknowledgementSet); - eventHandle2.setAcknowledgementSet(acknowledgementSet); + eventHandle1.addAcknowledgementSet(acknowledgementSet); + eventHandle2.addAcknowledgementSet(acknowledgementSet); doThrow(RuntimeException.class).when(codec).writeEvent(event1, outputStream); diff --git a/data-prepper-plugins/s3-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/s3-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index 23c33feb6d..0000000000 --- a/data-prepper-plugins/s3-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1,3 +0,0 @@ -# To enable mocking of final classes with vanilla Mockito -# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods -mock-maker-inline diff --git a/data-prepper-plugins/s3-source/build.gradle b/data-prepper-plugins/s3-source/build.gradle index 1187fa7ec0..06818d8eaa 100644 --- a/data-prepper-plugins/s3-source/build.gradle +++ b/data-prepper-plugins/s3-source/build.gradle @@ -27,11 +27,11 @@ dependencies { implementation 'com.fasterxml.jackson.dataformat:jackson-dataformat-csv' implementation 'com.fasterxml.jackson.datatype:jackson-datatype-jsr310' implementation 'org.xerial.snappy:snappy-java:1.1.10.5' - implementation 'org.apache.parquet:parquet-common:1.14.0' + implementation libs.parquet.common implementation 'dev.failsafe:failsafe:3.3.2' implementation 'org.apache.httpcomponents:httpcore:4.4.16' testImplementation libs.commons.lang3 - testImplementation 'org.wiremock:wiremock:3.4.2' + testImplementation 'org.wiremock:wiremock:3.8.0' testImplementation 'org.eclipse.jetty:jetty-bom:11.0.20' testImplementation 'com.fasterxml.jackson.dataformat:jackson-dataformat-yaml' testImplementation testLibs.junit.vintage @@ -45,11 +45,14 @@ dependencies { testImplementation project(':data-prepper-plugins:parquet-codecs') testImplementation project(':data-prepper-test-event') 
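> Note on the deleted `mockito-extensions/org.mockito.plugins.MockMaker` resource: Mockito 5.x made the inline mock maker the default, so the opt-in file that enabled mocking of final classes on Mockito 2 through 4 is no longer needed. A small illustration, assuming Mockito 5 on the test classpath and a hypothetical final class:

```java
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

// Hypothetical final class, used only for illustration.
final class FinalGreeter {
    String greet() {
        return "hello";
    }
}

public class FinalMockExample {
    public static void main(String[] args) {
        // With Mockito 5+ the inline mock maker is the default, so mocking a
        // final class needs no mockito-extensions/org.mockito.plugins.MockMaker file.
        final FinalGreeter greeter = mock(FinalGreeter.class);
        when(greeter.greet()).thenReturn("mocked");
        System.out.println(greeter.greet()); // prints "mocked"
    }
}
```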
testImplementation libs.avro.core - testImplementation testLibs.hadoop.common - testImplementation 'org.apache.parquet:parquet-avro:1.14.0' - testImplementation 'org.apache.parquet:parquet-column:1.14.0' - testImplementation 'org.apache.parquet:parquet-common:1.14.0' - testImplementation 'org.apache.parquet:parquet-hadoop:1.14.0' + testImplementation(libs.hadoop.common) { + exclude group: 'org.eclipse.jetty' + exclude group: 'org.apache.hadoop', module: 'hadoop-auth' + exclude group: 'org.apache.zookeeper', module: 'zookeeper' + } + testImplementation libs.parquet.avro + testImplementation libs.parquet.column + testImplementation libs.parquet.hadoop } test { diff --git a/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/S3ScanPartitionCreationSupplier.java b/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/S3ScanPartitionCreationSupplier.java index 22e0a15678..66a0df271c 100644 --- a/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/S3ScanPartitionCreationSupplier.java +++ b/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/S3ScanPartitionCreationSupplier.java @@ -87,6 +87,7 @@ public List apply(final Map globalStateMap) if (Objects.nonNull(s3ScanKeyPathOption) && Objects.nonNull(s3ScanKeyPathOption.getS3ScanExcludeSuffixOptions())) excludeItems.addAll(s3ScanKeyPathOption.getS3ScanExcludeSuffixOptions()); + final Instant updatedScanTime = Instant.now(); if (Objects.nonNull(s3ScanKeyPathOption) && Objects.nonNull(s3ScanKeyPathOption.getS3scanIncludePrefixOptions())) s3ScanKeyPathOption.getS3scanIncludePrefixOptions().forEach(includePath -> { listObjectsV2Request.prefix(includePath); @@ -96,6 +97,8 @@ public List apply(final Map globalStateMap) else objectsToProcess.addAll(listFilteredS3ObjectsForBucket(excludeItems, listObjectsV2Request, scanOptions.getBucketOption().getName(), scanOptions.getUseStartDateTime(), scanOptions.getUseEndDateTime(), globalStateMap)); + + globalStateMap.put(scanOptions.getBucketOption().getName(), updatedScanTime.toString()); } globalStateMap.put(SCAN_COUNT, (Integer) globalStateMap.get(SCAN_COUNT) + 1); @@ -110,13 +113,13 @@ private List listFilteredS3ObjectsForBucket(final List globalStateMap) { - Instant mostRecentLastModifiedTimestamp = globalStateMap.get(bucket) != null ? Instant.parse((String) globalStateMap.get(bucket)) : null; + final Instant previousScanTime = globalStateMap.get(bucket) != null ? Instant.parse((String) globalStateMap.get(bucket)) : null; final List allPartitionIdentifiers = new ArrayList<>(); ListObjectsV2Response listObjectsV2Response = null; do { listObjectsV2Response = s3Client.listObjectsV2(listObjectsV2Request.fetchOwner(true).continuationToken(Objects.nonNull(listObjectsV2Response) ? 
listObjectsV2Response.nextContinuationToken() : null).build()); allPartitionIdentifiers.addAll(listObjectsV2Response.contents().stream() - .filter(s3Object -> isLastModifiedTimeAfterMostRecentScanForBucket(bucket, s3Object, globalStateMap)) + .filter(s3Object -> isLastModifiedTimeAfterMostRecentScanForBucket(previousScanTime, s3Object)) .map(s3Object -> Pair.of(s3Object.key(), instantToLocalDateTime(s3Object.lastModified()))) .filter(keyTimestampPair -> !keyTimestampPair.left().endsWith("/")) .filter(keyTimestampPair -> excludeKeyPaths.stream() @@ -127,12 +130,8 @@ private List listFilteredS3ObjectsForBucket(final List folderPartitions = allPartitionIdentifiers.stream() .map(partitionIdentifier -> { @@ -185,32 +184,13 @@ private void initializeGlobalStateMap(final Map globalStateMap) globalStateMap.put(SINGLE_SCAN_COMPLETE, false); } - private boolean isLastModifiedTimeAfterMostRecentScanForBucket(final String bucketName, - final S3Object s3Object, - final Map globalStateMap) { - if (!globalStateMap.containsKey(bucketName) || Objects.isNull(globalStateMap.get(bucketName))) { + private boolean isLastModifiedTimeAfterMostRecentScanForBucket(final Instant previousScanTime, + final S3Object s3Object) { + if (previousScanTime == null) { return true; } - final Instant lastProcessedObjectTimestamp = Instant.parse((String) globalStateMap.get(bucketName)); - - return s3Object.lastModified().compareTo(lastProcessedObjectTimestamp.minusSeconds(1)) >= 0; - } - - private Instant getMostRecentLastModifiedTimestamp(final ListObjectsV2Response listObjectsV2Response, - Instant mostRecentLastModifiedTimestamp) { - - if (Objects.isNull(schedulingOptions)) { - return null; - } - - for (final S3Object s3Object : listObjectsV2Response.contents()) { - if (Objects.isNull(mostRecentLastModifiedTimestamp) || s3Object.lastModified().isAfter(mostRecentLastModifiedTimestamp)) { - mostRecentLastModifiedTimestamp = s3Object.lastModified(); - } - } - - return mostRecentLastModifiedTimestamp; + return s3Object.lastModified().compareTo(previousScanTime) >= 0; } private boolean shouldScanBeSkipped(final Map globalStateMap) { diff --git a/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/SqsService.java b/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/SqsService.java index b05d2806d4..c674be5f68 100644 --- a/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/SqsService.java +++ b/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/SqsService.java @@ -17,9 +17,12 @@ import software.amazon.awssdk.services.sqs.SqsClient; import java.time.Duration; +import java.util.List; import java.util.concurrent.TimeUnit; import java.util.concurrent.Executors; import java.util.concurrent.ExecutorService; +import java.util.stream.Collectors; +import java.util.stream.IntStream; public class SqsService { private static final Logger LOG = LoggerFactory.getLogger(SqsService.class); @@ -34,6 +37,7 @@ public class SqsService { private final PluginMetrics pluginMetrics; private final AcknowledgementSetManager acknowledgementSetManager; private final ExecutorService executorService; + private final List sqsWorkers; public SqsService(final AcknowledgementSetManager acknowledgementSetManager, final S3SourceConfig s3SourceConfig, @@ -46,18 +50,20 @@ public SqsService(final AcknowledgementSetManager acknowledgementSetManager, this.acknowledgementSetManager = acknowledgementSetManager; 
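> Note on the `S3ScanPartitionCreationSupplier` change above: the supplier now records `Instant.now()` once per bucket scan, before listing begins, and filters the next scan against that saved instant. The removed `getMostRecentLastModifiedTimestamp` tracked the newest `lastModified` seen instead, which could skip objects uploaded while a long listing was in flight. A stripped-down sketch of the pattern, with illustrative names rather than the plugin's API:

```java
import java.time.Instant;
import java.util.List;
import java.util.stream.Collectors;

// The timestamp is captured before the listing, so an object written while
// the listing runs has a lastModified at or after the saved instant and is
// picked up by the next scan instead of being skipped.
public class ScanTimeTracker {
    record StoredObject(String key, Instant lastModified) {}

    private Instant previousScanTime; // null until the first scan completes

    public List<StoredObject> scan(final List<StoredObject> listing) {
        final Instant updatedScanTime = Instant.now(); // captured up front
        final List<StoredObject> newObjects = listing.stream()
                .filter(o -> previousScanTime == null
                        || o.lastModified().compareTo(previousScanTime) >= 0)
                .collect(Collectors.toList());
        previousScanTime = updatedScanTime; // persisted only after the scan
        return newObjects;
    }
}
```

The `>= 0` comparison deliberately re-admits objects whose `lastModified` equals the saved instant, trading an occasional duplicate for at-least-once coverage.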
this.sqsClient = createSqsClient(credentialsProvider); executorService = Executors.newFixedThreadPool(s3SourceConfig.getNumWorkers(), BackgroundThreadFactory.defaultExecutorThreadFactory("s3-source-sqs")); - } - public void start() { final Backoff backoff = Backoff.exponential(INITIAL_DELAY, MAXIMUM_DELAY).withJitter(JITTER_RATE) .withMaxAttempts(Integer.MAX_VALUE); - for (int i = 0; i < s3SourceConfig.getNumWorkers(); i++) { - executorService.submit(new SqsWorker(acknowledgementSetManager, sqsClient, s3Accessor, s3SourceConfig, pluginMetrics, backoff)); - } + sqsWorkers = IntStream.range(0, s3SourceConfig.getNumWorkers()) + .mapToObj(i -> new SqsWorker(acknowledgementSetManager, sqsClient, s3Accessor, s3SourceConfig, pluginMetrics, backoff)) + .collect(Collectors.toList()); + } + + public void start() { + sqsWorkers.forEach(executorService::submit); } SqsClient createSqsClient(final AwsCredentialsProvider credentialsProvider) { - LOG.info("Creating SQS client"); + LOG.debug("Creating SQS client"); return SqsClient.builder() .region(s3SourceConfig.getAwsAuthenticationOptions().getAwsRegion()) .credentialsProvider(credentialsProvider) @@ -68,8 +74,8 @@ SqsClient createSqsClient(final AwsCredentialsProvider credentialsProvider) { } public void stop() { - sqsClient.close(); executorService.shutdown(); + sqsWorkers.forEach(SqsWorker::stop); try { if (!executorService.awaitTermination(SHUTDOWN_TIMEOUT, TimeUnit.SECONDS)) { LOG.warn("Failed to terminate SqsWorkers"); @@ -82,5 +88,7 @@ public void stop() { Thread.currentThread().interrupt(); } } + + sqsClient.close(); } } diff --git a/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/SqsWorker.java b/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/SqsWorker.java index b3404cebf6..3c5fba0701 100644 --- a/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/SqsWorker.java +++ b/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/SqsWorker.java @@ -5,7 +5,6 @@ package org.opensearch.dataprepper.plugins.source.s3; -import com.fasterxml.jackson.databind.ObjectMapper; import com.linecorp.armeria.client.retry.Backoff; import io.micrometer.core.instrument.Counter; import io.micrometer.core.instrument.Timer; @@ -20,8 +19,7 @@ import org.opensearch.dataprepper.plugins.source.s3.filter.S3EventFilter; import org.opensearch.dataprepper.plugins.source.s3.filter.S3ObjectCreatedFilter; import org.opensearch.dataprepper.plugins.source.s3.parser.ParsedMessage; -import org.opensearch.dataprepper.plugins.source.s3.parser.S3EventBridgeNotificationParser; -import org.opensearch.dataprepper.plugins.source.s3.parser.S3EventNotificationParser; +import org.opensearch.dataprepper.plugins.source.s3.parser.SqsMessageParser; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import software.amazon.awssdk.core.exception.SdkException; @@ -75,11 +73,10 @@ public class SqsWorker implements Runnable { private final Counter sqsVisibilityTimeoutChangeFailedCount; private final Timer sqsMessageDelayTimer; private final Backoff standardBackoff; + private final SqsMessageParser sqsMessageParser; private int failedAttemptCount; private final boolean endToEndAcknowledgementsEnabled; private final AcknowledgementSetManager acknowledgementSetManager; - - private final ObjectMapper objectMapper = new ObjectMapper(); private volatile boolean isStopped = false; private Map parsedMessageVisibilityTimesMap; @@ -98,6 +95,7 @@ public 
SqsWorker(final AcknowledgementSetManager acknowledgementSetManager, sqsOptions = s3SourceConfig.getSqsOptions(); objectCreatedFilter = new S3ObjectCreatedFilter(); evenBridgeObjectCreatedFilter = new EventBridgeObjectCreatedFilter(); + sqsMessageParser = new SqsMessageParser(s3SourceConfig); failedAttemptCount = 0; parsedMessageVisibilityTimesMap = new HashMap<>(); @@ -139,7 +137,7 @@ int processSqsMessages() { if (!sqsMessages.isEmpty()) { sqsMessagesReceivedCounter.increment(sqsMessages.size()); - final Collection s3MessageEventNotificationRecords = getS3MessageEventNotificationRecords(sqsMessages); + final Collection s3MessageEventNotificationRecords = sqsMessageParser.parseSqsMessages(sqsMessages); // build s3ObjectReference from S3EventNotificationRecord if event name starts with ObjectCreated final List deleteMessageBatchRequestEntries = processS3EventNotificationRecords(s3MessageEventNotificationRecords); @@ -191,22 +189,6 @@ private ReceiveMessageRequest createReceiveMessageRequest() { .build(); } - private Collection getS3MessageEventNotificationRecords(final List sqsMessages) { - return sqsMessages.stream() - .map(this::convertS3EventMessages) - .collect(Collectors.toList()); - } - - private ParsedMessage convertS3EventMessages(final Message message) { - if (s3SourceConfig.getNotificationSource().equals(NotificationSourceOption.S3)) { - return new S3EventNotificationParser().parseMessage(message, objectMapper); - } - else if (s3SourceConfig.getNotificationSource().equals(NotificationSourceOption.EVENTBRIDGE)) { - return new S3EventBridgeNotificationParser().parseMessage(message, objectMapper); - } - return new ParsedMessage(message, true); - } - private List processS3EventNotificationRecords(final Collection s3EventNotificationRecords) { final List deleteMessageBatchRequestEntryCollection = new ArrayList<>(); final List parsedMessagesToRead = new ArrayList<>(); @@ -276,21 +258,7 @@ && isEventBridgeEventTypeCreated(parsedMessage)) { return; } parsedMessageVisibilityTimesMap.put(parsedMessage, newValue); - final ChangeMessageVisibilityRequest changeMessageVisibilityRequest = ChangeMessageVisibilityRequest.builder() - .visibilityTimeout(newVisibilityTimeoutSeconds) - .queueUrl(sqsOptions.getSqsUrl()) - .receiptHandle(parsedMessage.getMessage().receiptHandle()) - .build(); - - try { - sqsClient.changeMessageVisibility(changeMessageVisibilityRequest); - sqsVisibilityTimeoutChangedCount.increment(); - LOG.debug("Set visibility timeout for message {} to {}", parsedMessage.getMessage().messageId(), newVisibilityTimeoutSeconds); - } catch (Exception e) { - LOG.error("Failed to set visibility timeout for message {} to {}", parsedMessage.getMessage().messageId(), newVisibilityTimeoutSeconds, e); - sqsVisibilityTimeoutChangeFailedCount.increment(); - } - + increaseVisibilityTimeout(parsedMessage, newVisibilityTimeoutSeconds); }, Duration.ofSeconds(progressCheckInterval)); } @@ -308,6 +276,27 @@ && isEventBridgeEventTypeCreated(parsedMessage)) { return deleteMessageBatchRequestEntryCollection; } + private void increaseVisibilityTimeout(final ParsedMessage parsedMessage, final int newVisibilityTimeoutSeconds) { + if(isStopped) { + LOG.info("Some messages are pending completion of acknowledgments. Data Prepper will not increase the visibility timeout because it is shutting down. 
{}", parsedMessage); + return; + } + final ChangeMessageVisibilityRequest changeMessageVisibilityRequest = ChangeMessageVisibilityRequest.builder() + .visibilityTimeout(newVisibilityTimeoutSeconds) + .queueUrl(sqsOptions.getSqsUrl()) + .receiptHandle(parsedMessage.getMessage().receiptHandle()) + .build(); + + try { + sqsClient.changeMessageVisibility(changeMessageVisibilityRequest); + sqsVisibilityTimeoutChangedCount.increment(); + LOG.debug("Set visibility timeout for message {} to {}", parsedMessage.getMessage().messageId(), newVisibilityTimeoutSeconds); + } catch (Exception e) { + LOG.error("Failed to set visibility timeout for message {} to {}", parsedMessage.getMessage().messageId(), newVisibilityTimeoutSeconds, e); + sqsVisibilityTimeoutChangeFailedCount.increment(); + } + } + private Optional processS3Object( final ParsedMessage parsedMessage, final S3ObjectReference s3ObjectReference, @@ -328,6 +317,8 @@ private Optional processS3Object( } private void deleteSqsMessages(final List deleteMessageBatchRequestEntryCollection) { + if(isStopped) + return; if (deleteMessageBatchRequestEntryCollection.size() == 0) { return; } @@ -396,6 +387,5 @@ private S3ObjectReference populateS3Reference(final String bucketName, final Str void stop() { isStopped = true; - Thread.currentThread().interrupt(); } } diff --git a/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/parser/ParsedMessage.java b/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/parser/ParsedMessage.java index 18bbc58499..ed68dff063 100644 --- a/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/parser/ParsedMessage.java +++ b/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/parser/ParsedMessage.java @@ -11,6 +11,7 @@ import software.amazon.awssdk.services.sqs.model.Message; import java.util.List; +import java.util.Objects; public class ParsedMessage { private final Message message; @@ -24,14 +25,14 @@ public class ParsedMessage { private String detailType; public ParsedMessage(final Message message, final boolean failedParsing) { - this.message = message; + this.message = Objects.requireNonNull(message); this.failedParsing = failedParsing; this.emptyNotification = true; } - // S3EventNotification contains only one S3EventNotificationRecord ParsedMessage(final Message message, final List notificationRecords) { - this.message = message; + this.message = Objects.requireNonNull(message); + // S3EventNotification contains only one S3EventNotificationRecord this.bucketName = notificationRecords.get(0).getS3().getBucket().getName(); this.objectKey = notificationRecords.get(0).getS3().getObject().getUrlDecodedKey(); this.objectSize = notificationRecords.get(0).getS3().getObject().getSizeAsLong(); @@ -42,7 +43,7 @@ public ParsedMessage(final Message message, final boolean failedParsing) { } ParsedMessage(final Message message, final S3EventBridgeNotification eventBridgeNotification) { - this.message = message; + this.message = Objects.requireNonNull(message); this.bucketName = eventBridgeNotification.getDetail().getBucket().getName(); this.objectKey = eventBridgeNotification.getDetail().getObject().getUrlDecodedKey(); this.objectSize = eventBridgeNotification.getDetail().getObject().getSize(); @@ -85,4 +86,12 @@ public boolean isEmptyNotification() { public String getDetailType() { return detailType; } + + @Override + public String toString() { + return "Message{" + + "messageId=" + 
message.messageId() + + ", objectKey=" + objectKey + + '}'; + } } diff --git a/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/parser/SqsMessageParser.java b/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/parser/SqsMessageParser.java new file mode 100644 index 0000000000..ea40e3f041 --- /dev/null +++ b/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/s3/parser/SqsMessageParser.java @@ -0,0 +1,44 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.s3.parser; + +import com.fasterxml.jackson.databind.ObjectMapper; +import org.opensearch.dataprepper.plugins.source.s3.S3SourceConfig; +import software.amazon.awssdk.services.sqs.model.Message; + +import java.util.Collection; +import java.util.stream.Collectors; + +public class SqsMessageParser { + private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper(); + private final S3SourceConfig s3SourceConfig; + private final S3NotificationParser s3NotificationParser; + + public SqsMessageParser(final S3SourceConfig s3SourceConfig) { + this.s3SourceConfig = s3SourceConfig; + s3NotificationParser = createNotificationParser(s3SourceConfig); + } + + public Collection parseSqsMessages(final Collection sqsMessages) { + return sqsMessages.stream() + .map(this::convertS3EventMessages) + .collect(Collectors.toList()); + } + + private ParsedMessage convertS3EventMessages(final Message message) { + return s3NotificationParser.parseMessage(message, OBJECT_MAPPER); + } + + private static S3NotificationParser createNotificationParser(final S3SourceConfig s3SourceConfig) { + switch (s3SourceConfig.getNotificationSource()) { + case EVENTBRIDGE: + return new S3EventBridgeNotificationParser(); + case S3: + default: + return new S3EventNotificationParser(); + } + } +} diff --git a/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/S3ScanPartitionCreationSupplierTest.java b/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/S3ScanPartitionCreationSupplierTest.java index e320981b9d..0545a49459 100644 --- a/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/S3ScanPartitionCreationSupplierTest.java +++ b/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/S3ScanPartitionCreationSupplierTest.java @@ -40,6 +40,9 @@ import static org.hamcrest.MatcherAssert.assertThat; import static org.hamcrest.Matchers.containsInAnyOrder; import static org.hamcrest.Matchers.equalTo; +import static org.hamcrest.Matchers.greaterThan; +import static org.hamcrest.Matchers.greaterThanOrEqualTo; +import static org.hamcrest.Matchers.lessThanOrEqualTo; import static org.hamcrest.Matchers.notNullValue; import static org.mockito.BDDMockito.given; import static org.mockito.Mockito.mock; @@ -213,7 +216,7 @@ void getNextPartition_supplier_with_scheduling_options_returns_expected_Partitio s3ObjectsList.add(invalidForFirstBucketSuffixObject); expectedPartitionIdentifiers.add(PartitionIdentifier.builder().withPartitionKey(secondBucket + "|" + invalidForFirstBucketSuffixObject.key()).build()); - final Instant mostRecentFirstScan = Instant.now().plusSeconds(1); + final Instant mostRecentFirstScan = Instant.now().plusSeconds(2); final S3Object validObject = mock(S3Object.class); given(validObject.key()).willReturn("valid"); 
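> Note on the new `SqsMessageParser` above: it selects between `S3EventNotificationParser` and `S3EventBridgeNotificationParser` once, at construction, instead of re-checking the notification source for every message as the removed `convertS3EventMessages` did. A hedged usage sketch (the mock-based setup and the empty JSON body are illustrative only):

```java
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.util.Collection;
import java.util.List;

import org.opensearch.dataprepper.plugins.source.s3.S3SourceConfig;
import org.opensearch.dataprepper.plugins.source.s3.configuration.NotificationSourceOption;
import org.opensearch.dataprepper.plugins.source.s3.parser.ParsedMessage;
import org.opensearch.dataprepper.plugins.source.s3.parser.SqsMessageParser;
import software.amazon.awssdk.services.sqs.model.Message;

public class SqsMessageParserSketch {
    public static void main(String[] args) {
        final S3SourceConfig config = mock(S3SourceConfig.class);
        when(config.getNotificationSource()).thenReturn(NotificationSourceOption.S3);

        // The concrete parser (S3 vs. EventBridge) is chosen here, once.
        final SqsMessageParser parser = new SqsMessageParser(config);

        final Message message = Message.builder()
                .messageId("message-1")
                .body("{}") // a body with no Records resolves to an empty notification
                .build();
        final Collection<ParsedMessage> parsed = parser.parseSqsMessages(List.of(message));
        System.out.println("parsed " + parsed.size() + " message(s)");
    }
}
```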
given(validObject.lastModified()).willReturn(mostRecentFirstScan); @@ -237,10 +240,6 @@ void getNextPartition_supplier_with_scheduling_options_returns_expected_Partitio given(listObjectsResponse.contents()) .willReturn(s3ObjectsList) .willReturn(s3ObjectsList) - .willReturn(s3ObjectsList) - .willReturn(s3ObjectsList) - .willReturn(secondScanObjects) - .willReturn(secondScanObjects) .willReturn(secondScanObjects) .willReturn(secondScanObjects); @@ -248,6 +247,8 @@ void getNextPartition_supplier_with_scheduling_options_returns_expected_Partitio given(s3Client.listObjectsV2(listObjectsV2RequestArgumentCaptor.capture())).willReturn(listObjectsResponse); final Map globalStateMap = new HashMap<>(); + + final Instant beforeFirstScan = Instant.now(); final List resultingPartitions = partitionCreationSupplier.apply(globalStateMap); assertThat(resultingPartitions, notNullValue()); @@ -260,10 +261,13 @@ void getNextPartition_supplier_with_scheduling_options_returns_expected_Partitio assertThat(globalStateMap.containsKey(SCAN_COUNT), equalTo(true)); assertThat(globalStateMap.get(SCAN_COUNT), equalTo(1)); assertThat(globalStateMap.containsKey(firstBucket), equalTo(true)); - assertThat(globalStateMap.get(firstBucket), equalTo(mostRecentFirstScan.toString())); + assertThat(Instant.parse((CharSequence) globalStateMap.get(firstBucket)), lessThanOrEqualTo(mostRecentFirstScan)); + assertThat(Instant.parse((CharSequence) globalStateMap.get(firstBucket)), greaterThanOrEqualTo(beforeFirstScan)); assertThat(globalStateMap.containsKey(secondBucket), equalTo(true)); - assertThat(globalStateMap.get(secondBucket), equalTo(mostRecentFirstScan.toString())); + assertThat(Instant.parse((CharSequence) globalStateMap.get(secondBucket)), lessThanOrEqualTo(mostRecentFirstScan)); + assertThat(Instant.parse((CharSequence) globalStateMap.get(secondBucket)), greaterThanOrEqualTo(beforeFirstScan)); + final Instant beforeSecondScan = Instant.now(); final List secondScanPartitions = partitionCreationSupplier.apply(globalStateMap); assertThat(secondScanPartitions.size(), equalTo(expectedPartitionIdentifiersSecondScan.size())); assertThat(secondScanPartitions.stream().map(PartitionIdentifier::getPartitionKey).collect(Collectors.toList()), @@ -273,14 +277,16 @@ void getNextPartition_supplier_with_scheduling_options_returns_expected_Partitio assertThat(globalStateMap.containsKey(SCAN_COUNT), equalTo(true)); assertThat(globalStateMap.get(SCAN_COUNT), equalTo(2)); assertThat(globalStateMap.containsKey(firstBucket), equalTo(true)); - assertThat(globalStateMap.get(firstBucket), equalTo(mostRecentSecondScan.toString())); + assertThat(Instant.parse((CharSequence) globalStateMap.get(firstBucket)), lessThanOrEqualTo(mostRecentSecondScan)); + assertThat(Instant.parse((CharSequence) globalStateMap.get(firstBucket)), greaterThanOrEqualTo(beforeSecondScan)); assertThat(globalStateMap.containsKey(secondBucket), equalTo(true)); - assertThat(globalStateMap.get(secondBucket), equalTo(mostRecentSecondScan.toString())); + assertThat(Instant.parse((CharSequence) globalStateMap.get(secondBucket)), lessThanOrEqualTo(mostRecentSecondScan)); + assertThat(Instant.parse((CharSequence) globalStateMap.get(secondBucket)), greaterThan(beforeSecondScan)); assertThat(Instant.ofEpochMilli((Long) globalStateMap.get(LAST_SCAN_TIME)).isBefore(Instant.now()), equalTo(true)); assertThat(partitionCreationSupplier.apply(globalStateMap), equalTo(Collections.emptyList())); - verify(listObjectsResponse, times(8)).contents(); + verify(listObjectsResponse, times(4)).contents(); 
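> Note on the assertion changes above: with the supplier now storing its own `Instant.now()`, the test can no longer predict the exact stored value, so it brackets each `apply` call and asserts the parsed instant falls inside the bracket. The pattern in isolation, assuming Hamcrest on the classpath:

```java
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.greaterThanOrEqualTo;
import static org.hamcrest.Matchers.lessThanOrEqualTo;

import java.time.Instant;

// When production code calls Instant.now() itself, a test cannot assert
// exact equality; instead it records instants before and after the call
// and asserts the stored value lies within that window.
public class TimeWindowAssertionSketch {
    static Instant recordScanTime() {
        return Instant.now(); // stands in for the supplier storing updatedScanTime
    }

    public static void main(String[] args) {
        final Instant before = Instant.now();
        final Instant recorded = recordScanTime();
        final Instant after = Instant.now();

        assertThat(recorded, greaterThanOrEqualTo(before));
        assertThat(recorded, lessThanOrEqualTo(after));
        System.out.println("recorded " + recorded + " within [" + before + ", " + after + "]");
    }
}
```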
} @Test diff --git a/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/SqsWorkerTest.java b/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/SqsWorkerTest.java index 50ed879f4a..ada789cea6 100644 --- a/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/SqsWorkerTest.java +++ b/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/SqsWorkerTest.java @@ -12,6 +12,7 @@ import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Nested; import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; import org.junit.jupiter.api.extension.ExtensionContext; import org.junit.jupiter.params.ParameterizedTest; import org.junit.jupiter.params.provider.Arguments; @@ -19,19 +20,21 @@ import org.junit.jupiter.params.provider.ArgumentsSource; import org.junit.jupiter.params.provider.ValueSource; import org.mockito.ArgumentCaptor; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; import org.opensearch.dataprepper.metrics.PluginMetrics; import org.opensearch.dataprepper.model.acknowledgements.AcknowledgementSet; import org.opensearch.dataprepper.model.acknowledgements.AcknowledgementSetManager; -import org.opensearch.dataprepper.plugins.source.s3.configuration.AwsAuthenticationOptions; +import org.opensearch.dataprepper.model.acknowledgements.ProgressCheck; import org.opensearch.dataprepper.plugins.source.s3.configuration.NotificationSourceOption; import org.opensearch.dataprepper.plugins.source.s3.configuration.OnErrorOption; import org.opensearch.dataprepper.plugins.source.s3.configuration.SqsOptions; import org.opensearch.dataprepper.plugins.source.s3.exception.SqsRetriesExhaustedException; import org.opensearch.dataprepper.plugins.source.s3.filter.S3EventFilter; import org.opensearch.dataprepper.plugins.source.s3.filter.S3ObjectCreatedFilter; -import software.amazon.awssdk.regions.Region; import software.amazon.awssdk.services.sqs.SqsClient; import software.amazon.awssdk.services.sqs.model.BatchResultErrorEntry; +import software.amazon.awssdk.services.sqs.model.ChangeMessageVisibilityRequest; import software.amazon.awssdk.services.sqs.model.DeleteMessageBatchRequest; import software.amazon.awssdk.services.sqs.model.DeleteMessageBatchResponse; import software.amazon.awssdk.services.sqs.model.DeleteMessageBatchResultEntry; @@ -50,6 +53,7 @@ import java.util.Collections; import java.util.List; import java.util.UUID; +import java.util.function.Consumer; import java.util.stream.Collectors; import java.util.stream.IntStream; import java.util.stream.Stream; @@ -65,20 +69,23 @@ import static org.mockito.ArgumentMatchers.anyInt; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.never; +import static org.mockito.Mockito.reset; import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; import static org.mockito.Mockito.verifyNoInteractions; import static org.mockito.Mockito.verifyNoMoreInteractions; import static org.mockito.Mockito.when; +import static org.opensearch.dataprepper.plugins.source.s3.SqsWorker.ACKNOWLEDGEMENT_SET_CALLACK_METRIC_NAME; +import static org.opensearch.dataprepper.plugins.source.s3.SqsWorker.S3_OBJECTS_EMPTY_METRIC_NAME; import static org.opensearch.dataprepper.plugins.source.s3.SqsWorker.SQS_MESSAGES_DELETED_METRIC_NAME; import static 
org.opensearch.dataprepper.plugins.source.s3.SqsWorker.SQS_MESSAGES_DELETE_FAILED_METRIC_NAME; import static org.opensearch.dataprepper.plugins.source.s3.SqsWorker.SQS_MESSAGES_FAILED_METRIC_NAME; import static org.opensearch.dataprepper.plugins.source.s3.SqsWorker.SQS_MESSAGES_RECEIVED_METRIC_NAME; import static org.opensearch.dataprepper.plugins.source.s3.SqsWorker.SQS_MESSAGE_DELAY_METRIC_NAME; -import static org.opensearch.dataprepper.plugins.source.s3.SqsWorker.S3_OBJECTS_EMPTY_METRIC_NAME; +import static org.opensearch.dataprepper.plugins.source.s3.SqsWorker.SQS_VISIBILITY_TIMEOUT_CHANGED_COUNT_METRIC_NAME; +@ExtendWith(MockitoExtension.class) class SqsWorkerTest { - private SqsWorker sqsWorker; private SqsClient sqsClient; private S3Service s3Service; private S3SourceConfig s3SourceConfig; @@ -90,10 +97,13 @@ class SqsWorkerTest { private Counter sqsMessagesFailedCounter; private Counter sqsMessagesDeleteFailedCounter; private Counter s3ObjectsEmptyCounter; + @Mock + private Counter sqsVisibilityTimeoutChangedCount; private Timer sqsMessageDelayTimer; private AcknowledgementSetManager acknowledgementSetManager; private AcknowledgementSet acknowledgementSet; private SqsOptions sqsOptions; + private String queueUrl; @BeforeEach void setUp() { @@ -105,15 +115,11 @@ void setUp() { objectCreatedFilter = new S3ObjectCreatedFilter(); backoff = mock(Backoff.class); - AwsAuthenticationOptions awsAuthenticationOptions = mock(AwsAuthenticationOptions.class); - when(awsAuthenticationOptions.getAwsRegion()).thenReturn(Region.US_EAST_1); - sqsOptions = mock(SqsOptions.class); - when(sqsOptions.getSqsUrl()).thenReturn("https://sqs.us-east-2.amazonaws.com/123456789012/MyQueue"); + queueUrl = "https://sqs.us-east-2.amazonaws.com/123456789012/" + UUID.randomUUID(); + when(sqsOptions.getSqsUrl()).thenReturn(queueUrl); - when(s3SourceConfig.getAwsAuthenticationOptions()).thenReturn(awsAuthenticationOptions); when(s3SourceConfig.getSqsOptions()).thenReturn(sqsOptions); - when(s3SourceConfig.getOnErrorOption()).thenReturn(OnErrorOption.RETAIN_MESSAGES); when(s3SourceConfig.getAcknowledgements()).thenReturn(false); when(s3SourceConfig.getNotificationSource()).thenReturn(NotificationSourceOption.S3); @@ -130,8 +136,12 @@ void setUp() { when(pluginMetrics.counter(SQS_MESSAGES_DELETE_FAILED_METRIC_NAME)).thenReturn(sqsMessagesDeleteFailedCounter); when(pluginMetrics.counter(S3_OBJECTS_EMPTY_METRIC_NAME)).thenReturn(s3ObjectsEmptyCounter); when(pluginMetrics.timer(SQS_MESSAGE_DELAY_METRIC_NAME)).thenReturn(sqsMessageDelayTimer); + when(pluginMetrics.counter(ACKNOWLEDGEMENT_SET_CALLACK_METRIC_NAME)).thenReturn(mock(Counter.class)); + when(pluginMetrics.counter(SQS_VISIBILITY_TIMEOUT_CHANGED_COUNT_METRIC_NAME)).thenReturn(sqsVisibilityTimeoutChangedCount); + } - sqsWorker = new SqsWorker(acknowledgementSetManager, sqsClient, s3Service, s3SourceConfig, pluginMetrics, backoff); + private SqsWorker createObjectUnderTest() { + return new SqsWorker(acknowledgementSetManager, sqsClient, s3Service, s3SourceConfig, pluginMetrics, backoff); } @AfterEach @@ -167,7 +177,7 @@ void processSqsMessages_should_return_number_of_messages_processed(final String when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); - final int messagesProcessed = sqsWorker.processSqsMessages(); + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); final ArgumentCaptor 
deleteMessageBatchRequestArgumentCaptor = ArgumentCaptor.forClass(DeleteMessageBatchRequest.class); verify(sqsClient).deleteMessageBatch(deleteMessageBatchRequestArgumentCaptor.capture()); final DeleteMessageBatchRequest actualDeleteMessageBatchRequest = deleteMessageBatchRequestArgumentCaptor.getValue(); @@ -190,93 +200,6 @@ void processSqsMessages_should_return_number_of_messages_processed(final String assertThat(actualDelay, greaterThanOrEqualTo(Duration.ofHours(1).minus(Duration.ofSeconds(5)))); } - @ParameterizedTest - @ValueSource(strings = {"ObjectCreated:Put", "ObjectCreated:Post", "ObjectCreated:Copy", "ObjectCreated:CompleteMultipartUpload"}) - void processSqsMessages_should_return_number_of_messages_processed_with_acknowledgements(final String eventName) throws IOException { - when(acknowledgementSetManager.create(any(), any(Duration.class))).thenReturn(acknowledgementSet); - when(s3SourceConfig.getAcknowledgements()).thenReturn(true); - sqsWorker = new SqsWorker(acknowledgementSetManager, sqsClient, s3Service, s3SourceConfig, pluginMetrics, backoff); - Instant startTime = Instant.now().minus(1, ChronoUnit.HOURS); - final Message message = mock(Message.class); - when(message.body()).thenReturn(createEventNotification(eventName, startTime)); - final String testReceiptHandle = UUID.randomUUID().toString(); - when(message.messageId()).thenReturn(testReceiptHandle); - when(message.receiptHandle()).thenReturn(testReceiptHandle); - - final ReceiveMessageResponse receiveMessageResponse = mock(ReceiveMessageResponse.class); - when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); - when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); - - final int messagesProcessed = sqsWorker.processSqsMessages(); - final ArgumentCaptor deleteMessageBatchRequestArgumentCaptor = ArgumentCaptor.forClass(DeleteMessageBatchRequest.class); - - final ArgumentCaptor durationArgumentCaptor = ArgumentCaptor.forClass(Duration.class); - verify(sqsMessageDelayTimer).record(durationArgumentCaptor.capture()); - Duration actualDelay = durationArgumentCaptor.getValue(); - - assertThat(messagesProcessed, equalTo(1)); - verify(s3Service).addS3Object(any(S3ObjectReference.class), any()); - verify(acknowledgementSetManager).create(any(), any(Duration.class)); - verify(sqsMessagesReceivedCounter).increment(1); - verifyNoInteractions(sqsMessagesDeletedCounter); - assertThat(actualDelay, lessThanOrEqualTo(Duration.ofHours(1).plus(Duration.ofSeconds(5)))); - assertThat(actualDelay, greaterThanOrEqualTo(Duration.ofHours(1).minus(Duration.ofSeconds(5)))); - } - - @ParameterizedTest - @ValueSource(strings = {"ObjectCreated:Put", "ObjectCreated:Post", "ObjectCreated:Copy", "ObjectCreated:CompleteMultipartUpload"}) - void processSqsMessages_should_return_number_of_messages_processed_with_acknowledgements_and_progress_check(final String eventName) throws IOException { - when(sqsOptions.getVisibilityDuplicateProtection()).thenReturn(true); - when(sqsOptions.getVisibilityTimeout()).thenReturn(Duration.ofSeconds(6)); - when(acknowledgementSetManager.create(any(), any(Duration.class))).thenReturn(acknowledgementSet); - when(s3SourceConfig.getAcknowledgements()).thenReturn(true); - sqsWorker = new SqsWorker(acknowledgementSetManager, sqsClient, s3Service, s3SourceConfig, pluginMetrics, backoff); - Instant startTime = Instant.now().minus(1, ChronoUnit.HOURS); - final Message message = mock(Message.class); - 
when(message.body()).thenReturn(createEventNotification(eventName, startTime)); - final String testReceiptHandle = UUID.randomUUID().toString(); - when(message.messageId()).thenReturn(testReceiptHandle); - when(message.receiptHandle()).thenReturn(testReceiptHandle); - - final ReceiveMessageResponse receiveMessageResponse = mock(ReceiveMessageResponse.class); - when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); - when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); - - final int messagesProcessed = sqsWorker.processSqsMessages(); - final ArgumentCaptor deleteMessageBatchRequestArgumentCaptor = ArgumentCaptor.forClass(DeleteMessageBatchRequest.class); - - final ArgumentCaptor durationArgumentCaptor = ArgumentCaptor.forClass(Duration.class); - verify(sqsMessageDelayTimer).record(durationArgumentCaptor.capture()); - Duration actualDelay = durationArgumentCaptor.getValue(); - - assertThat(messagesProcessed, equalTo(1)); - verify(s3Service).addS3Object(any(S3ObjectReference.class), any()); - verify(acknowledgementSetManager).create(any(), any(Duration.class)); - verify(acknowledgementSet).addProgressCheck(any(), any(Duration.class)); - verify(sqsMessagesReceivedCounter).increment(1); - verifyNoInteractions(sqsMessagesDeletedCounter); - assertThat(actualDelay, lessThanOrEqualTo(Duration.ofHours(1).plus(Duration.ofSeconds(5)))); - assertThat(actualDelay, greaterThanOrEqualTo(Duration.ofHours(1).minus(Duration.ofSeconds(5)))); - } - - @ParameterizedTest - @ValueSource(strings = {"", "{\"foo\": \"bar\""}) - void processSqsMessages_should_not_interact_with_S3Service_if_input_is_not_valid_JSON(String inputString) { - final Message message = mock(Message.class); - when(message.body()).thenReturn(inputString); - - final ReceiveMessageResponse receiveMessageResponse = mock(ReceiveMessageResponse.class); - when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); - when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); - - final int messagesProcessed = sqsWorker.processSqsMessages(); - assertThat(messagesProcessed, equalTo(1)); - verifyNoInteractions(s3Service); - verify(sqsClient, never()).deleteMessageBatch(any(DeleteMessageBatchRequest.class)); - verify(sqsMessagesReceivedCounter).increment(1); - verify(sqsMessagesFailedCounter).increment(); - } - @Test void processSqsMessages_should_not_interact_with_S3Service_and_delete_message_if_TestEvent() { final String messageId = UUID.randomUUID().toString(); @@ -291,7 +214,7 @@ void processSqsMessages_should_not_interact_with_S3Service_and_delete_message_if when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); - final int messagesProcessed = sqsWorker.processSqsMessages(); + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); assertThat(messagesProcessed, equalTo(1)); verifyNoInteractions(s3Service); @@ -324,7 +247,7 @@ void processSqsMessages_should_not_interact_with_S3Service_and_delete_message_if when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); - final int messagesProcessed = sqsWorker.processSqsMessages(); + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); 
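> Note on the repeated `sqsWorker` to `createObjectUnderTest()` swap: it pairs with the new `@ExtendWith(MockitoExtension.class)`, whose strict stubbing fails tests that carry unused stubs. Defaults such as `getOnErrorOption` therefore move out of `setUp` into the tests that use them, and the worker is constructed only after each test has finished stubbing. A toy illustration of the pattern with hypothetical `Clock`/`Alarm` types:

```java
import static org.junit.jupiter.api.Assertions.assertTrue;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.junit.jupiter.MockitoExtension;

@ExtendWith(MockitoExtension.class)
class AlarmTest {
    interface Clock { int hour(); }

    record Alarm(Clock clock) {
        boolean shouldRing() { return clock.hour() == 12; }
    }

    private final Clock clock = mock(Clock.class);

    // Build the object under test inside each test, after stubbing, so a
    // constructor that reads configuration sees this test's stubs, and so
    // strict stubbing never flags a setUp() stub that some tests never use.
    private Alarm createObjectUnderTest() {
        return new Alarm(clock);
    }

    @Test
    void rings_at_noon() {
        when(clock.hour()).thenReturn(12); // stubbed only where it is used
        assertTrue(createObjectUnderTest().shouldRing());
    }
}
```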
assertThat(messagesProcessed, equalTo(1)); verifyNoInteractions(s3Service); @@ -354,7 +277,7 @@ void processSqsMessages_with_irrelevant_eventName_should_return_number_of_messag when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); - final int messagesProcessed = sqsWorker.processSqsMessages(); + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); assertThat(messagesProcessed, equalTo(1)); verifyNoInteractions(s3Service); @@ -378,7 +301,7 @@ void processSqsMessages_should_invoke_delete_if_input_is_not_valid_JSON_and_dele when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); - final int messagesProcessed = sqsWorker.processSqsMessages(); + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); final ArgumentCaptor deleteMessageBatchRequestArgumentCaptor = ArgumentCaptor.forClass(DeleteMessageBatchRequest.class); verify(sqsClient).deleteMessageBatch(deleteMessageBatchRequestArgumentCaptor.capture()); final DeleteMessageBatchRequest actualDeleteMessageBatchRequest = deleteMessageBatchRequestArgumentCaptor.getValue(); @@ -410,7 +333,7 @@ void processSqsMessages_should_return_number_of_messages_processed_when_using_Ev when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); - final int messagesProcessed = sqsWorker.processSqsMessages(); + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); final ArgumentCaptor deleteMessageBatchRequestArgumentCaptor = ArgumentCaptor.forClass(DeleteMessageBatchRequest.class); verify(sqsClient).deleteMessageBatch(deleteMessageBatchRequestArgumentCaptor.capture()); final DeleteMessageBatchRequest actualDeleteMessageBatchRequest = deleteMessageBatchRequestArgumentCaptor.getValue(); @@ -447,7 +370,7 @@ void processSqsMessages_should_return_number_of_messages_processed_when_using_Se when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); - final int messagesProcessed = sqsWorker.processSqsMessages(); + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); final ArgumentCaptor deleteMessageBatchRequestArgumentCaptor = ArgumentCaptor.forClass(DeleteMessageBatchRequest.class); verify(sqsClient).deleteMessageBatch(deleteMessageBatchRequestArgumentCaptor.capture()); final DeleteMessageBatchRequest actualDeleteMessageBatchRequest = deleteMessageBatchRequestArgumentCaptor.getValue(); @@ -502,7 +425,7 @@ void processSqsMessages_should_report_correct_metrics_for_DeleteMessages_when_so when(deleteMessageBatchResponse.failed()).thenReturn(failedDeletes); when(sqsClient.deleteMessageBatch(any(DeleteMessageBatchRequest.class))).thenReturn(deleteMessageBatchResponse); - final int messagesProcessed = sqsWorker.processSqsMessages(); + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); final ArgumentCaptor deleteMessageBatchRequestArgumentCaptor = ArgumentCaptor.forClass(DeleteMessageBatchRequest.class); verify(sqsClient).deleteMessageBatch(deleteMessageBatchRequestArgumentCaptor.capture()); @@ -542,7 +465,7 @@ void 
processSqsMessages_should_report_correct_metrics_for_DeleteMessages_when_re when(sqsClient.deleteMessageBatch(any(DeleteMessageBatchRequest.class))).thenThrow(exClass); - final int messagesProcessed = sqsWorker.processSqsMessages(); + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); final ArgumentCaptor deleteMessageBatchRequestArgumentCaptor = ArgumentCaptor.forClass(DeleteMessageBatchRequest.class); verify(sqsClient).deleteMessageBatch(deleteMessageBatchRequestArgumentCaptor.capture()); @@ -565,7 +488,7 @@ void processSqsMessages_should_report_correct_metrics_for_DeleteMessages_when_re @Test void processSqsMessages_should_return_zero_messages_when_a_SqsException_is_thrown() { when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenThrow(SqsException.class); - final int messagesProcessed = sqsWorker.processSqsMessages(); + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); assertThat(messagesProcessed, equalTo(0)); verify(sqsClient, never()).deleteMessageBatch(any(DeleteMessageBatchRequest.class)); } @@ -573,7 +496,7 @@ void processSqsMessages_should_return_zero_messages_when_a_SqsException_is_throw @Test void processSqsMessages_should_return_zero_messages_with_backoff_when_a_SqsException_is_thrown() { when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenThrow(SqsException.class); - final int messagesProcessed = sqsWorker.processSqsMessages(); + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); verify(backoff).nextDelayMillis(1); assertThat(messagesProcessed, equalTo(0)); } @@ -582,7 +505,8 @@ void processSqsMessages_should_return_zero_messages_with_backoff_when_a_SqsExcep void processSqsMessages_should_throw_when_a_SqsException_is_thrown_with_max_retries() { when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenThrow(SqsException.class); when(backoff.nextDelayMillis(anyInt())).thenReturn((long) -1); - assertThrows(SqsRetriesExhaustedException.class, () -> sqsWorker.processSqsMessages()); + SqsWorker objectUnderTest = createObjectUnderTest(); + assertThrows(SqsRetriesExhaustedException.class, () -> objectUnderTest.processSqsMessages()); } @ParameterizedTest @@ -591,11 +515,13 @@ void processSqsMessages_should_return_zero_messages_when_messages_are_not_S3Even final Message message = mock(Message.class); when(message.body()).thenReturn(inputString); + when(s3SourceConfig.getOnErrorOption()).thenReturn(OnErrorOption.RETAIN_MESSAGES); + final ReceiveMessageResponse receiveMessageResponse = mock(ReceiveMessageResponse.class); when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); - final int messagesProcessed = sqsWorker.processSqsMessages(); + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); assertThat(messagesProcessed, equalTo(1)); verifyNoInteractions(s3Service); verify(sqsClient, never()).deleteMessageBatch(any(DeleteMessageBatchRequest.class)); @@ -605,6 +531,7 @@ void processSqsMessages_should_return_zero_messages_when_messages_are_not_S3Even @Test void populateS3Reference_should_interact_with_getUrlDecodedKey() throws NoSuchMethodException, InvocationTargetException, IllegalAccessException { + reset(sqsOptions); // Using reflection to unit test a private method as part of bug fix. 
Class params[] = new Class[2]; params[0] = String.class; @@ -617,21 +544,176 @@ void populateS3Reference_should_interact_with_getUrlDecodedKey() throws NoSuchMe final S3EventNotification.S3ObjectEntity s3ObjectEntity = mock(S3EventNotification.S3ObjectEntity.class); final S3EventNotification.S3BucketEntity s3BucketEntity = mock(S3EventNotification.S3BucketEntity.class); - when(s3EventNotificationRecord.getS3()).thenReturn(s3Entity); - when(s3Entity.getBucket()).thenReturn(s3BucketEntity); - when(s3Entity.getObject()).thenReturn(s3ObjectEntity); - when(s3BucketEntity.getName()).thenReturn("test-bucket-name"); - when(s3ObjectEntity.getUrlDecodedKey()).thenReturn("test-key"); - - final S3ObjectReference s3ObjectReference = (S3ObjectReference) method.invoke(sqsWorker, "test-bucket-name", "test-key"); + final S3ObjectReference s3ObjectReference = (S3ObjectReference) method.invoke(createObjectUnderTest(), "test-bucket-name", "test-key"); assertThat(s3ObjectReference, notNullValue()); assertThat(s3ObjectReference.getBucketName(), equalTo("test-bucket-name")); assertThat(s3ObjectReference.getKey(), equalTo("test-key")); -// verify(s3ObjectEntity).getUrlDecodedKey(); verifyNoMoreInteractions(s3ObjectEntity); } + + @ParameterizedTest + @ValueSource(strings = {"ObjectCreated:Put", "ObjectCreated:Post", "ObjectCreated:Copy", "ObjectCreated:CompleteMultipartUpload"}) + void processSqsMessages_should_return_number_of_messages_processed_with_acknowledgements(final String eventName) throws IOException { + when(acknowledgementSetManager.create(any(), any(Duration.class))).thenReturn(acknowledgementSet); + when(s3SourceConfig.getAcknowledgements()).thenReturn(true); + Instant startTime = Instant.now().minus(1, ChronoUnit.HOURS); + final Message message = mock(Message.class); + when(message.body()).thenReturn(createEventNotification(eventName, startTime)); + final String testReceiptHandle = UUID.randomUUID().toString(); + when(message.messageId()).thenReturn(testReceiptHandle); + when(message.receiptHandle()).thenReturn(testReceiptHandle); + + final ReceiveMessageResponse receiveMessageResponse = mock(ReceiveMessageResponse.class); + when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); + when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); + + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); + final ArgumentCaptor deleteMessageBatchRequestArgumentCaptor = ArgumentCaptor.forClass(DeleteMessageBatchRequest.class); + + final ArgumentCaptor durationArgumentCaptor = ArgumentCaptor.forClass(Duration.class); + verify(sqsMessageDelayTimer).record(durationArgumentCaptor.capture()); + Duration actualDelay = durationArgumentCaptor.getValue(); + + assertThat(messagesProcessed, equalTo(1)); + verify(s3Service).addS3Object(any(S3ObjectReference.class), any()); + verify(acknowledgementSetManager).create(any(), any(Duration.class)); + verify(sqsMessagesReceivedCounter).increment(1); + verifyNoInteractions(sqsMessagesDeletedCounter); + assertThat(actualDelay, lessThanOrEqualTo(Duration.ofHours(1).plus(Duration.ofSeconds(5)))); + assertThat(actualDelay, greaterThanOrEqualTo(Duration.ofHours(1).minus(Duration.ofSeconds(5)))); + } + + @ParameterizedTest + @ValueSource(strings = {"ObjectCreated:Put", "ObjectCreated:Post", "ObjectCreated:Copy", "ObjectCreated:CompleteMultipartUpload"}) + void processSqsMessages_should_return_number_of_messages_processed_with_acknowledgements_and_progress_check(final String eventName) 
throws IOException { + when(sqsOptions.getVisibilityDuplicateProtection()).thenReturn(true); + when(sqsOptions.getVisibilityTimeout()).thenReturn(Duration.ofSeconds(6)); + when(acknowledgementSetManager.create(any(), any(Duration.class))).thenReturn(acknowledgementSet); + when(s3SourceConfig.getAcknowledgements()).thenReturn(true); + Instant startTime = Instant.now().minus(1, ChronoUnit.HOURS); + final Message message = mock(Message.class); + when(message.body()).thenReturn(createEventNotification(eventName, startTime)); + final String testReceiptHandle = UUID.randomUUID().toString(); + when(message.messageId()).thenReturn(testReceiptHandle); + when(message.receiptHandle()).thenReturn(testReceiptHandle); + + final ReceiveMessageResponse receiveMessageResponse = mock(ReceiveMessageResponse.class); + when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); + when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); + + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); + final ArgumentCaptor deleteMessageBatchRequestArgumentCaptor = ArgumentCaptor.forClass(DeleteMessageBatchRequest.class); + + final ArgumentCaptor durationArgumentCaptor = ArgumentCaptor.forClass(Duration.class); + verify(sqsMessageDelayTimer).record(durationArgumentCaptor.capture()); + Duration actualDelay = durationArgumentCaptor.getValue(); + + assertThat(messagesProcessed, equalTo(1)); + verify(s3Service).addS3Object(any(S3ObjectReference.class), any()); + verify(acknowledgementSetManager).create(any(), any(Duration.class)); + verify(acknowledgementSet).addProgressCheck(any(), any(Duration.class)); + verify(sqsMessagesReceivedCounter).increment(1); + verifyNoInteractions(sqsMessagesDeletedCounter); + assertThat(actualDelay, lessThanOrEqualTo(Duration.ofHours(1).plus(Duration.ofSeconds(5)))); + assertThat(actualDelay, greaterThanOrEqualTo(Duration.ofHours(1).minus(Duration.ofSeconds(5)))); + } + + @ParameterizedTest + @ValueSource(strings = {"", "{\"foo\": \"bar\""}) + void processSqsMessages_should_not_interact_with_S3Service_if_input_is_not_valid_JSON(String inputString) { + final Message message = mock(Message.class); + when(message.body()).thenReturn(inputString); + + when(s3SourceConfig.getOnErrorOption()).thenReturn(OnErrorOption.RETAIN_MESSAGES); + + final ReceiveMessageResponse receiveMessageResponse = mock(ReceiveMessageResponse.class); + when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); + when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); + + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); + assertThat(messagesProcessed, equalTo(1)); + verifyNoInteractions(s3Service); + verify(sqsClient, never()).deleteMessageBatch(any(DeleteMessageBatchRequest.class)); + verify(sqsMessagesReceivedCounter).increment(1); + verify(sqsMessagesFailedCounter).increment(); + } + + @Test + void processSqsMessages_should_update_visibility_timeout_when_progress_changes() throws IOException { + when(sqsOptions.getVisibilityDuplicateProtection()).thenReturn(true); + when(sqsOptions.getVisibilityTimeout()).thenReturn(Duration.ofMillis(1)); + when(acknowledgementSetManager.create(any(), any(Duration.class))).thenReturn(acknowledgementSet); + when(s3SourceConfig.getAcknowledgements()).thenReturn(true); + Instant startTime = Instant.now().minus(1, ChronoUnit.HOURS); + final Message message = mock(Message.class); + 
when(message.body()).thenReturn(createEventNotification("ObjectCreated:Put", startTime)); + final String testReceiptHandle = UUID.randomUUID().toString(); + when(message.messageId()).thenReturn(testReceiptHandle); + when(message.receiptHandle()).thenReturn(testReceiptHandle); + + final ReceiveMessageResponse receiveMessageResponse = mock(ReceiveMessageResponse.class); + when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); + when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); + + final int messagesProcessed = createObjectUnderTest().processSqsMessages(); + + assertThat(messagesProcessed, equalTo(1)); + verify(s3Service).addS3Object(any(S3ObjectReference.class), any()); + verify(acknowledgementSetManager).create(any(), any(Duration.class)); + + ArgumentCaptor<Consumer<ProgressCheck>> progressConsumerArgumentCaptor = ArgumentCaptor.forClass(Consumer.class); + verify(acknowledgementSet).addProgressCheck(progressConsumerArgumentCaptor.capture(), any(Duration.class)); + final Consumer<ProgressCheck> actualConsumer = progressConsumerArgumentCaptor.getValue(); + final ProgressCheck progressCheck = mock(ProgressCheck.class); + actualConsumer.accept(progressCheck); + + ArgumentCaptor<ChangeMessageVisibilityRequest> changeMessageVisibilityRequestArgumentCaptor = ArgumentCaptor.forClass(ChangeMessageVisibilityRequest.class); + verify(sqsClient).changeMessageVisibility(changeMessageVisibilityRequestArgumentCaptor.capture()); + ChangeMessageVisibilityRequest actualChangeVisibilityRequest = changeMessageVisibilityRequestArgumentCaptor.getValue(); + assertThat(actualChangeVisibilityRequest.queueUrl(), equalTo(queueUrl)); + assertThat(actualChangeVisibilityRequest.receiptHandle(), equalTo(testReceiptHandle)); + verify(sqsMessagesReceivedCounter).increment(1); + verify(sqsMessageDelayTimer).record(any(Duration.class)); + } + + @Test + void processSqsMessages_should_stop_updating_visibility_timeout_after_stop() throws IOException { + when(sqsOptions.getVisibilityDuplicateProtection()).thenReturn(true); + when(sqsOptions.getVisibilityTimeout()).thenReturn(Duration.ofMillis(1)); + when(acknowledgementSetManager.create(any(), any(Duration.class))).thenReturn(acknowledgementSet); + when(s3SourceConfig.getAcknowledgements()).thenReturn(true); + Instant startTime = Instant.now().minus(1, ChronoUnit.HOURS); + final Message message = mock(Message.class); + when(message.body()).thenReturn(createEventNotification("ObjectCreated:Put", startTime)); + final String testReceiptHandle = UUID.randomUUID().toString(); + when(message.messageId()).thenReturn(testReceiptHandle); + when(message.receiptHandle()).thenReturn(testReceiptHandle); + + final ReceiveMessageResponse receiveMessageResponse = mock(ReceiveMessageResponse.class); + when(sqsClient.receiveMessage(any(ReceiveMessageRequest.class))).thenReturn(receiveMessageResponse); + when(receiveMessageResponse.messages()).thenReturn(Collections.singletonList(message)); + + SqsWorker objectUnderTest = createObjectUnderTest(); + final int messagesProcessed = objectUnderTest.processSqsMessages(); + objectUnderTest.stop(); + + assertThat(messagesProcessed, equalTo(1)); + verify(s3Service).addS3Object(any(S3ObjectReference.class), any()); + verify(acknowledgementSetManager).create(any(), any(Duration.class)); + + ArgumentCaptor<Consumer<ProgressCheck>> progressConsumerArgumentCaptor = ArgumentCaptor.forClass(Consumer.class); + verify(acknowledgementSet).addProgressCheck(progressConsumerArgumentCaptor.capture(), any(Duration.class)); + final Consumer<ProgressCheck> actualConsumer =
progressConsumerArgumentCaptor.getValue(); + final ProgressCheck progressCheck = mock(ProgressCheck.class); + actualConsumer.accept(progressCheck); + + verify(sqsClient, never()).changeMessageVisibility(any(ChangeMessageVisibilityRequest.class)); + verify(sqsMessagesReceivedCounter).increment(1); + verify(sqsMessageDelayTimer).record(any(Duration.class)); + } + private static String createPutNotification(final Instant startTime) { return createEventNotification("ObjectCreated:Put", startTime); } diff --git a/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/ParsedMessageTest.java b/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/ParsedMessageTest.java index 3acec973e1..51f3abad06 100644 --- a/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/ParsedMessageTest.java +++ b/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/ParsedMessageTest.java @@ -2,6 +2,7 @@ import org.joda.time.DateTime; import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Nested; import org.junit.jupiter.api.Test; import org.opensearch.dataprepper.plugins.source.s3.S3EventBridgeNotification; import org.opensearch.dataprepper.plugins.source.s3.S3EventNotification; @@ -12,33 +13,31 @@ import java.util.UUID; import static org.hamcrest.MatcherAssert.assertThat; +import static org.hamcrest.Matchers.containsString; import static org.hamcrest.Matchers.equalTo; +import static org.hamcrest.Matchers.notNullValue; +import static org.junit.jupiter.api.Assertions.assertThrows; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; class ParsedMessageTest { private static final Random RANDOM = new Random(); private Message message; - private S3EventNotification.S3Entity s3Entity; - private S3EventNotification.S3BucketEntity s3BucketEntity; - private S3EventNotification.S3ObjectEntity s3ObjectEntity; - private S3EventNotification.S3EventNotificationRecord s3EventNotificationRecord; - private S3EventBridgeNotification s3EventBridgeNotification; - private S3EventBridgeNotification.Detail detail; - private S3EventBridgeNotification.Bucket bucket; - private S3EventBridgeNotification.Object object; + private String testBucketName; + private String testDecodedObjectKey; + private long testSize; @BeforeEach void setUp() { message = mock(Message.class); - s3Entity = mock(S3EventNotification.S3Entity.class); - s3BucketEntity = mock(S3EventNotification.S3BucketEntity.class); - s3ObjectEntity = mock(S3EventNotification.S3ObjectEntity.class); - s3EventNotificationRecord = mock(S3EventNotification.S3EventNotificationRecord.class); - s3EventBridgeNotification = mock(S3EventBridgeNotification.class); - detail = mock(S3EventBridgeNotification.Detail.class); - bucket = mock(S3EventBridgeNotification.Bucket.class); - object = mock(S3EventBridgeNotification.Object.class); + testBucketName = UUID.randomUUID().toString(); + testDecodedObjectKey = UUID.randomUUID().toString(); + testSize = RANDOM.nextInt(1_000_000_000) + 1; + } + + @Test + void constructor_with_failed_parsing_throws_if_Message_is_null() { + assertThrows(NullPointerException.class, () -> new ParsedMessage(null, true)); } @Test @@ -50,61 +49,156 @@ void test_parsed_message_with_failed_parsing() { } @Test - void test_parsed_message_with_S3EventNotificationRecord() { - final String testBucketName = UUID.randomUUID().toString(); - final String testDecodedObjectKey = 
UUID.randomUUID().toString(); - final String testEventName = UUID.randomUUID().toString(); - final DateTime testEventTime = DateTime.now(); - final long testSize = RANDOM.nextLong(); - - when(s3EventNotificationRecord.getS3()).thenReturn(s3Entity); - when(s3Entity.getBucket()).thenReturn(s3BucketEntity); - when(s3Entity.getObject()).thenReturn(s3ObjectEntity); - when(s3ObjectEntity.getSizeAsLong()).thenReturn(testSize); - when(s3BucketEntity.getName()).thenReturn(testBucketName); - when(s3ObjectEntity.getUrlDecodedKey()).thenReturn(testDecodedObjectKey); - when(s3EventNotificationRecord.getEventName()).thenReturn(testEventName); - when(s3EventNotificationRecord.getEventTime()).thenReturn(testEventTime); - - final ParsedMessage parsedMessage = new ParsedMessage(message, List.of(s3EventNotificationRecord)); + void toString_with_failed_parsing_and_messageId() { + final String messageId = UUID.randomUUID().toString(); + when(message.messageId()).thenReturn(messageId); - assertThat(parsedMessage.getMessage(), equalTo(message)); - assertThat(parsedMessage.getBucketName(), equalTo(testBucketName)); - assertThat(parsedMessage.getObjectKey(), equalTo(testDecodedObjectKey)); - assertThat(parsedMessage.getObjectSize(), equalTo(testSize)); - assertThat(parsedMessage.getEventName(), equalTo(testEventName)); - assertThat(parsedMessage.getEventTime(), equalTo(testEventTime)); - assertThat(parsedMessage.isFailedParsing(), equalTo(false)); - assertThat(parsedMessage.isEmptyNotification(), equalTo(false)); + final ParsedMessage parsedMessage = new ParsedMessage(message, true); + final String actualString = parsedMessage.toString(); + assertThat(actualString, notNullValue()); + assertThat(actualString, containsString(messageId)); } @Test - void test_parsed_message_with_S3EventBridgeNotification() { - final String testBucketName = UUID.randomUUID().toString(); - final String testDecodedObjectKey = UUID.randomUUID().toString(); - final String testDetailType = UUID.randomUUID().toString(); - final DateTime testEventTime = DateTime.now(); - final int testSize = RANDOM.nextInt(); + void toString_with_failed_parsing_and_no_messageId() { + final ParsedMessage parsedMessage = new ParsedMessage(message, true); + final String actualString = parsedMessage.toString(); + assertThat(actualString, notNullValue()); + } - when(s3EventBridgeNotification.getDetail()).thenReturn(detail); - when(s3EventBridgeNotification.getDetail().getBucket()).thenReturn(bucket); - when(s3EventBridgeNotification.getDetail().getObject()).thenReturn(object); + @Nested + class WithS3EventNotificationRecord { + private S3EventNotification.S3Entity s3Entity; + private S3EventNotification.S3BucketEntity s3BucketEntity; + private S3EventNotification.S3ObjectEntity s3ObjectEntity; + private S3EventNotification.S3EventNotificationRecord s3EventNotificationRecord; + private List<S3EventNotification.S3EventNotificationRecord> s3EventNotificationRecords; + private String testEventName; + private DateTime testEventTime; - when(bucket.getName()).thenReturn(testBucketName); - when(object.getUrlDecodedKey()).thenReturn(testDecodedObjectKey); - when(object.getSize()).thenReturn(testSize); - when(s3EventBridgeNotification.getDetailType()).thenReturn(testDetailType); - when(s3EventBridgeNotification.getTime()).thenReturn(testEventTime); + @BeforeEach + void setUp() { + testEventName = UUID.randomUUID().toString(); + testEventTime = DateTime.now(); - final ParsedMessage parsedMessage = new ParsedMessage(message, s3EventBridgeNotification); + s3Entity = mock(S3EventNotification.S3Entity.class); +
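// The mocks in this nested class mirror the shape of an S3 event notification, where each record exposes its
// bucket and object under s3.bucket.name and s3.object.key/size. Abbreviated from the DIRECT_SQS_MESSAGE
// fixture later in this diff:
//
//     {"Records":[{"eventName":"ObjectCreated:Put",
//                  "s3":{"bucket":{"name":"my-bucket"},
//                        "object":{"key":"path/to/myfile.log.gz","size":3159112}}}]}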
s3BucketEntity = mock(S3EventNotification.S3BucketEntity.class); + s3ObjectEntity = mock(S3EventNotification.S3ObjectEntity.class); + s3EventNotificationRecord = mock(S3EventNotification.S3EventNotificationRecord.class); - assertThat(parsedMessage.getMessage(), equalTo(message)); - assertThat(parsedMessage.getBucketName(), equalTo(testBucketName)); - assertThat(parsedMessage.getObjectKey(), equalTo(testDecodedObjectKey)); - assertThat(parsedMessage.getObjectSize(), equalTo((long) testSize)); - assertThat(parsedMessage.getDetailType(), equalTo(testDetailType)); - assertThat(parsedMessage.getEventTime(), equalTo(testEventTime)); - assertThat(parsedMessage.isFailedParsing(), equalTo(false)); - assertThat(parsedMessage.isEmptyNotification(), equalTo(false)); + when(s3EventNotificationRecord.getS3()).thenReturn(s3Entity); + when(s3Entity.getBucket()).thenReturn(s3BucketEntity); + when(s3Entity.getObject()).thenReturn(s3ObjectEntity); + when(s3ObjectEntity.getSizeAsLong()).thenReturn(testSize); + when(s3BucketEntity.getName()).thenReturn(testBucketName); + when(s3ObjectEntity.getUrlDecodedKey()).thenReturn(testDecodedObjectKey); + when(s3EventNotificationRecord.getEventName()).thenReturn(testEventName); + when(s3EventNotificationRecord.getEventTime()).thenReturn(testEventTime); + + s3EventNotificationRecords = List.of(s3EventNotificationRecord); + } + + private ParsedMessage createObjectUnderTest() { + return new ParsedMessage(message, s3EventNotificationRecords); + } + + @Test + void constructor_with_S3EventNotificationRecord_throws_if_Message_is_null() { + message = null; + assertThrows(NullPointerException.class, this::createObjectUnderTest); + } + + @Test + void test_parsed_message_with_S3EventNotificationRecord() { + final ParsedMessage parsedMessage = createObjectUnderTest(); + + assertThat(parsedMessage.getMessage(), equalTo(message)); + assertThat(parsedMessage.getBucketName(), equalTo(testBucketName)); + assertThat(parsedMessage.getObjectKey(), equalTo(testDecodedObjectKey)); + assertThat(parsedMessage.getObjectSize(), equalTo(testSize)); + assertThat(parsedMessage.getEventName(), equalTo(testEventName)); + assertThat(parsedMessage.getEventTime(), equalTo(testEventTime)); + assertThat(parsedMessage.isFailedParsing(), equalTo(false)); + assertThat(parsedMessage.isEmptyNotification(), equalTo(false)); + } + + @Test + void toString_with_messageId() { + final String messageId = UUID.randomUUID().toString(); + when(message.messageId()).thenReturn(messageId); + + final ParsedMessage parsedMessage = createObjectUnderTest(); + final String actualString = parsedMessage.toString(); + assertThat(actualString, notNullValue()); + assertThat(actualString, containsString(messageId)); + assertThat(actualString, containsString(testDecodedObjectKey)); + } + } + + @Nested + class WithS3EventBridgeNotification { + private String testDetailType; + private DateTime testEventTime; + private S3EventBridgeNotification s3EventBridgeNotification; + private S3EventBridgeNotification.Detail detail; + private S3EventBridgeNotification.Bucket bucket; + private S3EventBridgeNotification.Object object; + + @BeforeEach + void setUp() { + s3EventBridgeNotification = mock(S3EventBridgeNotification.class); + detail = mock(S3EventBridgeNotification.Detail.class); + bucket = mock(S3EventBridgeNotification.Bucket.class); + object = mock(S3EventBridgeNotification.Object.class); + + testDetailType = UUID.randomUUID().toString(); + testEventTime = DateTime.now(); + + 
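// For comparison, the EventBridge-style notification stubbed here carries the bucket and object under "detail"
// rather than "Records[].s3", plus a top-level "detail-type" and "time". Abbreviated from the EVENTBRIDGE_MESSAGE
// fixture later in this diff (the object fields are elided in that constant as shown):
//
//     {"detail-type":"Object Created","time":"2021-11-12T00:00:00Z",
//      "detail":{"bucket":{"name":"DOC-EXAMPLE-BUCKET1"},"object":{...}}}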
when(s3EventBridgeNotification.getDetail()).thenReturn(detail); + when(s3EventBridgeNotification.getDetail().getBucket()).thenReturn(bucket); + when(s3EventBridgeNotification.getDetail().getObject()).thenReturn(object); + + when(bucket.getName()).thenReturn(testBucketName); + when(object.getUrlDecodedKey()).thenReturn(testDecodedObjectKey); + when(object.getSize()).thenReturn((int) testSize); + when(s3EventBridgeNotification.getDetailType()).thenReturn(testDetailType); + when(s3EventBridgeNotification.getTime()).thenReturn(testEventTime); + } + + private ParsedMessage createObjectUnderTest() { + return new ParsedMessage(message, s3EventBridgeNotification); + } + + @Test + void constructor_with_S3EventBridgeNotification_throws_if_Message_is_null() { + message = null; + assertThrows(NullPointerException.class, () -> createObjectUnderTest()); + } + + @Test + void test_parsed_message_with_S3EventBridgeNotification() { + final ParsedMessage parsedMessage = createObjectUnderTest(); + + assertThat(parsedMessage.getMessage(), equalTo(message)); + assertThat(parsedMessage.getBucketName(), equalTo(testBucketName)); + assertThat(parsedMessage.getObjectKey(), equalTo(testDecodedObjectKey)); + assertThat(parsedMessage.getObjectSize(), equalTo(testSize)); + assertThat(parsedMessage.getDetailType(), equalTo(testDetailType)); + assertThat(parsedMessage.getEventTime(), equalTo(testEventTime)); + assertThat(parsedMessage.isFailedParsing(), equalTo(false)); + assertThat(parsedMessage.isEmptyNotification(), equalTo(false)); + } + + @Test + void toString_with_messageId() { + final String messageId = UUID.randomUUID().toString(); + when(message.messageId()).thenReturn(messageId); + + final ParsedMessage parsedMessage = createObjectUnderTest(); + final String actualString = parsedMessage.toString(); + assertThat(actualString, notNullValue()); + assertThat(actualString, containsString(messageId)); + assertThat(actualString, containsString(testDecodedObjectKey)); + } } } diff --git a/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/S3EventBridgeNotificationParserTest.java b/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/S3EventBridgeNotificationParserTest.java index c779ec561f..db361d70e1 100644 --- a/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/S3EventBridgeNotificationParserTest.java +++ b/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/S3EventBridgeNotificationParserTest.java @@ -19,7 +19,7 @@ class S3EventBridgeNotificationParserTest { private final ObjectMapper objectMapper = new ObjectMapper(); - private final String EVENTBRIDGE_MESSAGE = "{\"version\":\"0\",\"id\":\"17793124-05d4-b198-2fde-7ededc63b103\",\"detail-type\":\"Object Created\"," + + static final String EVENTBRIDGE_MESSAGE = "{\"version\":\"0\",\"id\":\"17793124-05d4-b198-2fde-7ededc63b103\",\"detail-type\":\"Object Created\"," + "\"source\":\"aws.s3\",\"account\":\"111122223333\",\"time\":\"2021-11-12T00:00:00Z\"," + "\"region\":\"ca-central-1\",\"resources\":[\"arn:aws:s3:::DOC-EXAMPLE-BUCKET1\"]," + "\"detail\":{\"version\":\"0\",\"bucket\":{\"name\":\"DOC-EXAMPLE-BUCKET1\"}," + diff --git a/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/S3EventNotificationParserTest.java 
b/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/S3EventNotificationParserTest.java index a3d2c91679..c9e3a39da8 100644 --- a/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/S3EventNotificationParserTest.java +++ b/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/S3EventNotificationParserTest.java @@ -16,8 +16,8 @@ import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; -class S3EventNotificationParserTest { - private static final String DIRECT_SQS_MESSAGE = +public class S3EventNotificationParserTest { + static final String DIRECT_SQS_MESSAGE = "{\"Records\":[{\"eventVersion\":\"2.1\",\"eventSource\":\"aws:s3\",\"awsRegion\":\"us-east-1\",\"eventTime\":\"2023-04-28T16:00:11.324Z\"," + "\"eventName\":\"ObjectCreated:Put\",\"userIdentity\":{\"principalId\":\"AWS:xyz\"},\"requestParameters\":{\"sourceIPAddress\":\"127.0.0.1\"}," + "\"responseElements\":{\"x-amz-request-id\":\"xyz\",\"x-amz-id-2\":\"xyz\"},\"s3\":{\"s3SchemaVersion\":\"1.0\"," + @@ -25,7 +25,7 @@ class S3EventNotificationParserTest { "\"arn\":\"arn:aws:s3:::my-bucket\"},\"object\":{\"key\":\"path/to/myfile.log.gz\",\"size\":3159112,\"eTag\":\"abcd123\"," + "\"sequencer\":\"000\"}}}]}"; - private static final String SNS_BASED_MESSAGE = "{\n" + + public static final String SNS_BASED_MESSAGE = "{\n" + " \"Type\" : \"Notification\",\n" + " \"MessageId\" : \"4e01e115-5b91-5096-8a74-bee95ed1e123\",\n" + " \"TopicArn\" : \"arn:aws:sns:us-east-1:123456789012:notifications\",\n" + diff --git a/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/SqsMessageParserTest.java b/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/SqsMessageParserTest.java new file mode 100644 index 0000000000..d0dd711f7e --- /dev/null +++ b/data-prepper-plugins/s3-source/src/test/java/org/opensearch/dataprepper/plugins/source/s3/parser/SqsMessageParserTest.java @@ -0,0 +1,96 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.source.s3.parser; + +import org.junit.jupiter.api.extension.ExtendWith; +import org.junit.jupiter.api.extension.ExtensionContext; +import org.junit.jupiter.params.ParameterizedTest; +import org.junit.jupiter.params.provider.Arguments; +import org.junit.jupiter.params.provider.ArgumentsProvider; +import org.junit.jupiter.params.provider.ArgumentsSource; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.dataprepper.plugins.source.s3.S3SourceConfig; +import org.opensearch.dataprepper.plugins.source.s3.configuration.NotificationSourceOption; +import software.amazon.awssdk.services.sqs.model.Message; + +import java.util.Collection; +import java.util.Collections; +import java.util.List; +import java.util.Set; +import java.util.stream.Collectors; +import java.util.stream.IntStream; +import java.util.stream.Stream; + +import static org.hamcrest.CoreMatchers.equalTo; +import static org.hamcrest.CoreMatchers.notNullValue; +import static org.hamcrest.MatcherAssert.assertThat; +import static org.hamcrest.Matchers.empty; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.when; + +@ExtendWith(MockitoExtension.class) +class SqsMessageParserTest { + @Mock + private S3SourceConfig s3SourceConfig; + + private SqsMessageParser 
createObjectUnderTest() { + return new SqsMessageParser(s3SourceConfig); + } + + @ParameterizedTest + @ArgumentsSource(SourceArgumentsProvider.class) + void parseSqsMessages_returns_empty_for_empty_messages(final NotificationSourceOption sourceOption) { + when(s3SourceConfig.getNotificationSource()).thenReturn(sourceOption); + final Collection<ParsedMessage> parsedMessages = createObjectUnderTest().parseSqsMessages(Collections.emptyList()); + + assertThat(parsedMessages, notNullValue()); + assertThat(parsedMessages, empty()); + } + + @ParameterizedTest + @ArgumentsSource(SourceArgumentsProvider.class) + void parseSqsMessages_parsed_messages(final NotificationSourceOption sourceOption, + final String messageBody, + final String replacementString) { + when(s3SourceConfig.getNotificationSource()).thenReturn(sourceOption); + final int numberOfMessages = 10; + List<Message> messages = IntStream.range(0, numberOfMessages) + .mapToObj(i -> messageBody.replaceAll(replacementString, replacementString + i)) + .map(SqsMessageParserTest::createMockMessage) + .collect(Collectors.toList()); + final Collection<ParsedMessage> parsedMessages = createObjectUnderTest().parseSqsMessages(messages); + + assertThat(parsedMessages, notNullValue()); + assertThat(parsedMessages.size(), equalTo(numberOfMessages)); + + final Set<String> bucketNames = parsedMessages.stream().map(ParsedMessage::getBucketName).collect(Collectors.toSet()); + assertThat("The bucket names are unique, so the bucketNames should match the numberOfMessages.", + bucketNames.size(), equalTo(numberOfMessages)); + } + + static class SourceArgumentsProvider implements ArgumentsProvider { + @Override + public Stream<? extends Arguments> provideArguments(final ExtensionContext extensionContext) { + return Stream.of( + Arguments.arguments( + NotificationSourceOption.S3, + S3EventNotificationParserTest.DIRECT_SQS_MESSAGE, + "my-bucket"), + Arguments.arguments( + NotificationSourceOption.EVENTBRIDGE, + S3EventBridgeNotificationParserTest.EVENTBRIDGE_MESSAGE, + "DOC-EXAMPLE-BUCKET1") + ); + } + } + + private static Message createMockMessage(final String body) { + final Message message = mock(Message.class); + when(message.body()).thenReturn(body); + return message; + } +} \ No newline at end of file diff --git a/data-prepper-plugins/s3-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/s3-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index 23c33feb6d..0000000000 --- a/data-prepper-plugins/s3-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1,3 +0,0 @@ -# To enable mocking of final classes with vanilla Mockito -# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods -mock-maker-inline diff --git a/data-prepper-plugins/service-map-stateful/build.gradle b/data-prepper-plugins/service-map-stateful/build.gradle index 60b9512ed9..ab2300f020 100644 --- a/data-prepper-plugins/service-map-stateful/build.gradle +++ b/data-prepper-plugins/service-map-stateful/build.gradle @@ -19,7 +19,7 @@ dependencies { exclude group: 'com.google.protobuf', module: 'protobuf-java' } implementation libs.protobuf.core - testImplementation testLibs.mockito.inline + testImplementation project(':data-prepper-test-common') } jacocoTestCoverageVerification { diff --git a/data-prepper-plugins/service-map-stateful/src/main/java/org/opensearch/dataprepper/plugins/processor/ServiceMapProcessorConfig.java
b/data-prepper-plugins/service-map-stateful/src/main/java/org/opensearch/dataprepper/plugins/processor/ServiceMapProcessorConfig.java index 8c337b2737..7f72fb5286 100644 --- a/data-prepper-plugins/service-map-stateful/src/main/java/org/opensearch/dataprepper/plugins/processor/ServiceMapProcessorConfig.java +++ b/data-prepper-plugins/service-map-stateful/src/main/java/org/opensearch/dataprepper/plugins/processor/ServiceMapProcessorConfig.java @@ -5,8 +5,20 @@ package org.opensearch.dataprepper.plugins.processor; +import com.fasterxml.jackson.annotation.JsonProperty; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; + public class ServiceMapProcessorConfig { - static final String WINDOW_DURATION = "window_duration"; + private static final String WINDOW_DURATION = "window_duration"; static final int DEFAULT_WINDOW_DURATION = 180; static final String DEFAULT_DB_PATH = "data/service-map/"; + + @JsonProperty(WINDOW_DURATION) + @JsonPropertyDescription("Represents the fixed time window, in seconds, " + + "during which service map relationships are evaluated. Default value is 180.") + private int windowDuration = DEFAULT_WINDOW_DURATION; + + public int getWindowDuration() { + return windowDuration; + } } diff --git a/data-prepper-plugins/service-map-stateful/src/main/java/org/opensearch/dataprepper/plugins/processor/ServiceMapStatefulProcessor.java b/data-prepper-plugins/service-map-stateful/src/main/java/org/opensearch/dataprepper/plugins/processor/ServiceMapStatefulProcessor.java index c02ccb17d6..75041a09b4 100644 --- a/data-prepper-plugins/service-map-stateful/src/main/java/org/opensearch/dataprepper/plugins/processor/ServiceMapStatefulProcessor.java +++ b/data-prepper-plugins/service-map-stateful/src/main/java/org/opensearch/dataprepper/plugins/processor/ServiceMapStatefulProcessor.java @@ -6,9 +6,11 @@ package org.opensearch.dataprepper.plugins.processor; import org.apache.commons.codec.DecoderException; +import org.opensearch.dataprepper.metrics.PluginMetrics; import org.opensearch.dataprepper.model.annotations.DataPrepperPlugin; +import org.opensearch.dataprepper.model.annotations.DataPrepperPluginConstructor; import org.opensearch.dataprepper.model.annotations.SingleThread; -import org.opensearch.dataprepper.model.configuration.PluginSetting; +import org.opensearch.dataprepper.model.configuration.PipelineDescription; import org.opensearch.dataprepper.model.event.Event; import org.opensearch.dataprepper.model.event.JacksonEvent; import org.opensearch.dataprepper.model.peerforwarder.RequiresPeerForwarding; @@ -40,7 +42,8 @@ import java.util.concurrent.atomic.AtomicInteger; @SingleThread -@DataPrepperPlugin(name = "service_map", deprecatedName = "service_map_stateful", pluginType = Processor.class) +@DataPrepperPlugin(name = "service_map", deprecatedName = "service_map_stateful", pluginType = Processor.class, + pluginConfigurationType = ServiceMapProcessorConfig.class) public class ServiceMapStatefulProcessor extends AbstractProcessor<Record<Event>, Record<Event>> implements RequiresPeerForwarding { static final String SPANS_DB_SIZE = "spansDbSize"; @@ -75,20 +78,24 @@ public class ServiceMapStatefulProcessor extends AbstractProcessor private final int thisProcessorId; - public ServiceMapStatefulProcessor(final PluginSetting pluginSetting) { - this(pluginSetting.getIntegerOrDefault(ServiceMapProcessorConfig.WINDOW_DURATION, ServiceMapProcessorConfig.DEFAULT_WINDOW_DURATION) * TO_MILLIS, + @DataPrepperPluginConstructor + public ServiceMapStatefulProcessor( + final ServiceMapProcessorConfig
serviceMapProcessorConfig, + final PluginMetrics pluginMetrics, + final PipelineDescription pipelineDescription) { + this((long) serviceMapProcessorConfig.getWindowDuration() * TO_MILLIS, new File(ServiceMapProcessorConfig.DEFAULT_DB_PATH), Clock.systemUTC(), - pluginSetting.getNumberOfProcessWorkers(), - pluginSetting); + pipelineDescription.getNumberOfProcessWorkers(), + pluginMetrics); } - public ServiceMapStatefulProcessor(final long windowDurationMillis, + ServiceMapStatefulProcessor(final long windowDurationMillis, final File databasePath, final Clock clock, final int processWorkers, - final PluginSetting pluginSetting) { - super(pluginSetting); + final PluginMetrics pluginMetrics) { + super(pluginMetrics); ServiceMapStatefulProcessor.clock = clock; this.thisProcessorId = processorsCreated.getAndIncrement(); diff --git a/data-prepper-plugins/service-map-stateful/src/test/java/org/opensearch/dataprepper/plugins/processor/ServiceMapProcessorConfigTest.java b/data-prepper-plugins/service-map-stateful/src/test/java/org/opensearch/dataprepper/plugins/processor/ServiceMapProcessorConfigTest.java new file mode 100644 index 0000000000..35ef3b0c07 --- /dev/null +++ b/data-prepper-plugins/service-map-stateful/src/test/java/org/opensearch/dataprepper/plugins/processor/ServiceMapProcessorConfigTest.java @@ -0,0 +1,38 @@ +package org.opensearch.dataprepper.plugins.processor; + +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.opensearch.dataprepper.test.helper.ReflectivelySetField; + +import java.util.Random; + +import static org.hamcrest.CoreMatchers.equalTo; +import static org.hamcrest.MatcherAssert.assertThat; +import static org.opensearch.dataprepper.plugins.processor.ServiceMapProcessorConfig.DEFAULT_WINDOW_DURATION; + +class ServiceMapProcessorConfigTest { + private ServiceMapProcessorConfig serviceMapProcessorConfig; + Random random; + + @BeforeEach + void setUp() { + serviceMapProcessorConfig = new ServiceMapProcessorConfig(); + random = new Random(); + } + + @Test + void testDefaultConfig() { + assertThat(serviceMapProcessorConfig.getWindowDuration(), equalTo(DEFAULT_WINDOW_DURATION)); + } + + @Test + void testGetter() throws NoSuchFieldException, IllegalAccessException { + final int windowDuration = 1 + random.nextInt(300); + ReflectivelySetField.setField( + ServiceMapProcessorConfig.class, + serviceMapProcessorConfig, + "windowDuration", + windowDuration); + assertThat(serviceMapProcessorConfig.getWindowDuration(), equalTo(windowDuration)); + } +} \ No newline at end of file diff --git a/data-prepper-plugins/service-map-stateful/src/test/java/org/opensearch/dataprepper/plugins/processor/ServiceMapStatefulProcessorTest.java b/data-prepper-plugins/service-map-stateful/src/test/java/org/opensearch/dataprepper/plugins/processor/ServiceMapStatefulProcessorTest.java index 28789615aa..b565642e19 100644 --- a/data-prepper-plugins/service-map-stateful/src/test/java/org/opensearch/dataprepper/plugins/processor/ServiceMapStatefulProcessorTest.java +++ b/data-prepper-plugins/service-map-stateful/src/test/java/org/opensearch/dataprepper/plugins/processor/ServiceMapStatefulProcessorTest.java @@ -14,6 +14,8 @@ import org.mockito.Mockito; import org.opensearch.dataprepper.metrics.MetricNames; import org.opensearch.dataprepper.metrics.MetricsTestUtil; +import org.opensearch.dataprepper.metrics.PluginMetrics; +import org.opensearch.dataprepper.model.configuration.PipelineDescription; import org.opensearch.dataprepper.model.configuration.PluginSetting; import 
org.opensearch.dataprepper.model.record.Record; import org.opensearch.dataprepper.model.trace.Span; @@ -43,6 +45,7 @@ import static org.junit.jupiter.api.Assertions.assertTrue; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; +import static org.opensearch.dataprepper.plugins.processor.ServiceMapProcessorConfig.DEFAULT_WINDOW_DURATION; public class ServiceMapStatefulProcessorTest { @@ -54,12 +57,20 @@ public class ServiceMapStatefulProcessorTest { private static final String PAYMENT_SERVICE = "PAY"; private static final String CART_SERVICE = "CART"; private PluginSetting pluginSetting; + private PluginMetrics pluginMetrics; + private PipelineDescription pipelineDescription; + private ServiceMapProcessorConfig serviceMapProcessorConfig; @BeforeEach public void setup() throws NoSuchFieldException, IllegalAccessException { resetServiceMapStatefulProcessorStatic(); MetricsTestUtil.initMetrics(); pluginSetting = mock(PluginSetting.class); + pipelineDescription = mock(PipelineDescription.class); + serviceMapProcessorConfig = mock(ServiceMapProcessorConfig.class); + when(serviceMapProcessorConfig.getWindowDuration()).thenReturn(DEFAULT_WINDOW_DURATION); + pluginMetrics = PluginMetrics.fromNames( + "testServiceMapProcessor", "testPipelineName"); when(pluginSetting.getName()).thenReturn("testServiceMapProcessor"); when(pluginSetting.getPipelineName()).thenReturn("testPipelineName"); } @@ -116,13 +127,11 @@ private Set<ServiceMapRelationship> evaluateEdges(Set<ServiceMapRelationship> serv } @Test - public void testPluginSettingConstructor() { - - final PluginSetting pluginSetting = new PluginSetting("testPluginSetting", Collections.emptyMap()); - pluginSetting.setProcessWorkers(4); - pluginSetting.setPipelineName("TestPipeline"); + public void testDataPrepperConstructor() { + when(pipelineDescription.getNumberOfProcessWorkers()).thenReturn(4); //Nothing is accessible to validate, so just verify that no exception is thrown.
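// A hedged sketch of the wiring implied by @DataPrepperPluginConstructor and pluginConfigurationType above
// (illustrative names only; the framework performs this binding reflectively): the processor's pipeline
// settings are deserialized into ServiceMapProcessorConfig and injected together with PluginMetrics and
// PipelineDescription, e.g.
//
//     final ServiceMapProcessorConfig config =
//             objectMapper.convertValue(processorSettingsMap, ServiceMapProcessorConfig.class); // binds window_duration
//     final ServiceMapStatefulProcessor processor =
//             new ServiceMapStatefulProcessor(config, pluginMetrics, pipelineDescription);
//
// which is why these tests now construct the processor from a mocked config rather than a PluginSetting.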
- final ServiceMapStatefulProcessor serviceMapStatefulProcessor = new ServiceMapStatefulProcessor(pluginSetting); + final ServiceMapStatefulProcessor serviceMapStatefulProcessor = new ServiceMapStatefulProcessor( + serviceMapProcessorConfig, pluginMetrics, pipelineDescription); } @Test @@ -132,8 +141,8 @@ public void testTraceGroupsWithEventRecordData() throws Exception { Mockito.when(clock.instant()).thenReturn(Instant.now()); ExecutorService threadpool = Executors.newCachedThreadPool(); final File path = new File(ServiceMapProcessorConfig.DEFAULT_DB_PATH); - final ServiceMapStatefulProcessor serviceMapStateful1 = new ServiceMapStatefulProcessor(100, path, clock, 2, pluginSetting); - final ServiceMapStatefulProcessor serviceMapStateful2 = new ServiceMapStatefulProcessor(100, path, clock, 2, pluginSetting); + final ServiceMapStatefulProcessor serviceMapStateful1 = new ServiceMapStatefulProcessor(100, path, clock, 2, pluginMetrics); + final ServiceMapStatefulProcessor serviceMapStateful2 = new ServiceMapStatefulProcessor(100, path, clock, 2, pluginMetrics); final byte[] rootSpanId1Bytes = ServiceMapTestUtils.getRandomBytes(8); final byte[] rootSpanId2Bytes = ServiceMapTestUtils.getRandomBytes(8); @@ -327,8 +336,8 @@ public void testTraceGroupsWithIsolatedServiceEventRecordData() throws Exception Mockito.when(clock.instant()).thenReturn(Instant.now()); ExecutorService threadpool = Executors.newCachedThreadPool(); final File path = new File(ServiceMapProcessorConfig.DEFAULT_DB_PATH); - final ServiceMapStatefulProcessor serviceMapStateful1 = new ServiceMapStatefulProcessor(100, path, clock, 2, pluginSetting); - final ServiceMapStatefulProcessor serviceMapStateful2 = new ServiceMapStatefulProcessor(100, path, clock, 2, pluginSetting); + final ServiceMapStatefulProcessor serviceMapStateful1 = new ServiceMapStatefulProcessor(100, path, clock, 2, pluginMetrics); + final ServiceMapStatefulProcessor serviceMapStateful2 = new ServiceMapStatefulProcessor(100, path, clock, 2, pluginMetrics); final byte[] rootSpanIdBytes = ServiceMapTestUtils.getRandomBytes(8); final byte[] traceIdBytes = ServiceMapTestUtils.getRandomBytes(16); @@ -383,7 +392,7 @@ public void testTraceGroupsWithIsolatedServiceEventRecordData() throws Exception @Test public void testPrepareForShutdownWithEventRecordData() { final File path = new File(ServiceMapProcessorConfig.DEFAULT_DB_PATH); - final ServiceMapStatefulProcessor serviceMapStateful = new ServiceMapStatefulProcessor(100, path, Clock.systemUTC(), 1, pluginSetting); + final ServiceMapStatefulProcessor serviceMapStateful = new ServiceMapStatefulProcessor(100, path, Clock.systemUTC(), 1, pluginMetrics); final byte[] rootSpanId1Bytes = ServiceMapTestUtils.getRandomBytes(8); final byte[] traceId1Bytes = ServiceMapTestUtils.getRandomBytes(16); @@ -411,11 +420,9 @@ public void testPrepareForShutdownWithEventRecordData() { @Test public void testGetIdentificationKeys() { - final PluginSetting pluginSetting = new PluginSetting("testPluginSetting", Collections.emptyMap()); - pluginSetting.setProcessWorkers(4); - pluginSetting.setPipelineName("TestPipeline"); - - final ServiceMapStatefulProcessor serviceMapStatefulProcessor = new ServiceMapStatefulProcessor(pluginSetting); + when(pipelineDescription.getNumberOfProcessWorkers()).thenReturn(4); + final ServiceMapStatefulProcessor serviceMapStatefulProcessor = new ServiceMapStatefulProcessor( + serviceMapProcessorConfig, pluginMetrics, pipelineDescription); final Collection<String> expectedIdentificationKeys =
serviceMapStatefulProcessor.getIdentificationKeys(); assertThat(expectedIdentificationKeys, equalTo(Collections.singleton("traceId"))); diff --git a/data-prepper-plugins/sns-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/sns-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index 23c33feb6d..0000000000 --- a/data-prepper-plugins/sns-sink/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1,3 +0,0 @@ -# To enable mocking of final classes with vanilla Mockito -# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods -mock-maker-inline diff --git a/data-prepper-plugins/split-event-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/splitevent/SplitEventProcessorConfig.java b/data-prepper-plugins/split-event-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/splitevent/SplitEventProcessorConfig.java index c4af96a3d4..db70e3c6db 100644 --- a/data-prepper-plugins/split-event-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/splitevent/SplitEventProcessorConfig.java +++ b/data-prepper-plugins/split-event-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/splitevent/SplitEventProcessorConfig.java @@ -11,6 +11,7 @@ package org.opensearch.dataprepper.plugins.processor.splitevent; import com.fasterxml.jackson.annotation.JsonProperty; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; import jakarta.validation.constraints.NotEmpty; import jakarta.validation.constraints.NotNull; import jakarta.validation.constraints.Size; @@ -20,12 +21,15 @@ public class SplitEventProcessorConfig { @NotEmpty @NotNull @JsonProperty("field") + @JsonPropertyDescription("The event field to be split") private String field; @JsonProperty("delimiter_regex") + @JsonPropertyDescription("The regular expression used as the delimiter for splitting the field") private String delimiterRegex; @Size(min = 1, max = 1) + @JsonPropertyDescription("The delimiter used for splitting the field. 
If not specified, the default delimiter is used") private String delimiter; public String getField() { diff --git a/data-prepper-plugins/split-event-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/splitevent/SplitEventProcessorTest.java b/data-prepper-plugins/split-event-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/splitevent/SplitEventProcessorTest.java index 7fc126fdf5..4e8944ab91 100644 --- a/data-prepper-plugins/split-event-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/splitevent/SplitEventProcessorTest.java +++ b/data-prepper-plugins/split-event-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/splitevent/SplitEventProcessorTest.java @@ -69,7 +69,7 @@ private Record<Event> createTestRecord(final Map<String, Object> data) { DefaultEventHandle eventHandle = (DefaultEventHandle) event.getEventHandle(); - eventHandle.setAcknowledgementSet(mockAcknowledgementSet); + eventHandle.addAcknowledgementSet(mockAcknowledgementSet); return new Record<>(event); } diff --git a/data-prepper-plugins/sqs-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker b/data-prepper-plugins/sqs-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker deleted file mode 100644 index 23c33feb6d..0000000000 --- a/data-prepper-plugins/sqs-source/src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker +++ /dev/null @@ -1,3 +0,0 @@ -# To enable mocking of final classes with vanilla Mockito -# https://github.com/mockito/mockito/wiki/What%27s-new-in-Mockito-2#mock-the-unmockable-opt-in-mocking-of-final-classesmethods -mock-maker-inline diff --git a/data-prepper-plugins/truncate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/truncate/TruncateProcessorConfig.java b/data-prepper-plugins/truncate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/truncate/TruncateProcessorConfig.java index 7fde949719..02c83f5773 100644 --- a/data-prepper-plugins/truncate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/truncate/TruncateProcessorConfig.java +++ b/data-prepper-plugins/truncate-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/truncate/TruncateProcessorConfig.java @@ -6,6 +6,7 @@ package org.opensearch.dataprepper.plugins.processor.truncate; import com.fasterxml.jackson.annotation.JsonProperty; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; import jakarta.validation.constraints.NotEmpty; import jakarta.validation.constraints.NotNull; import jakarta.validation.constraints.AssertTrue; @@ -16,18 +17,25 @@ public class TruncateProcessorConfig { public static class Entry { @JsonProperty("source_keys") + @JsonPropertyDescription("The list of source keys that will be modified by the processor. " + + "The default value is an empty list, which indicates that all values will be truncated.") private List<String> sourceKeys; @JsonProperty("start_at") + @JsonPropertyDescription("Where in the string value to start truncation. " + + "Default is `0`, which specifies to start truncation at the beginning of each key's value.") private Integer startAt; @JsonProperty("length") + @JsonPropertyDescription("The length of the string after truncation. 
" + + "When not specified, the processor will measure the length based on where the string ends.") private Integer length; @JsonProperty("recursive") private Boolean recurse = false; @JsonProperty("truncate_when") + @JsonPropertyDescription("A condition that, when met, determines when the truncate operation is performed.") private String truncateWhen; public Entry(final List sourceKeys, final Integer startAt, final Integer length, final String truncateWhen, final Boolean recurse) { @@ -77,6 +85,7 @@ public boolean isValidConfig() { @NotEmpty @NotNull + @JsonPropertyDescription("A list of entries to add to an event.") private List<@Valid Entry> entries; public List getEntries() { diff --git a/data-prepper-plugins/user-agent-processor/build.gradle b/data-prepper-plugins/user-agent-processor/build.gradle index 6ad33c84ba..5e92b158f5 100644 --- a/data-prepper-plugins/user-agent-processor/build.gradle +++ b/data-prepper-plugins/user-agent-processor/build.gradle @@ -11,7 +11,9 @@ dependencies { implementation project(':data-prepper-api') implementation project(':data-prepper-plugins:common') implementation 'com.fasterxml.jackson.core:jackson-databind' - implementation "com.github.ua-parser:uap-java:1.6.1" + implementation 'com.github.ua-parser:uap-java:1.6.1' + implementation libs.caffeine + testImplementation project(':data-prepper-test-event') } jacocoTestCoverageVerification { diff --git a/data-prepper-plugins/user-agent-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/useragent/CaffeineCachingParser.java b/data-prepper-plugins/user-agent-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/useragent/CaffeineCachingParser.java new file mode 100644 index 0000000000..45a96bd909 --- /dev/null +++ b/data-prepper-plugins/user-agent-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/useragent/CaffeineCachingParser.java @@ -0,0 +1,75 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.processor.useragent; + +import com.github.benmanes.caffeine.cache.Cache; +import com.github.benmanes.caffeine.cache.Caffeine; +import ua_parser.Client; +import ua_parser.Device; +import ua_parser.OS; +import ua_parser.Parser; +import ua_parser.UserAgent; + +import java.util.function.Function; + +/** + * A superclass of {@link Parser} which uses Caffeine as a cache. + */ +class CaffeineCachingParser extends Parser { + private final Cache clientCache; + private final Cache userAgentCache; + private final Cache deviceCache; + private final Cache osCache; + + /** + * Constructs a new instance with a given cache size. Each parse method + * will have its own cache. + * + * @param cacheSize The size of the cache as a count of items. 
+ */ + CaffeineCachingParser(final long cacheSize) { + userAgentCache = createCache(cacheSize); + clientCache = createCache(cacheSize); + deviceCache = createCache(cacheSize); + osCache = createCache(cacheSize); + } + + @Override + public Client parse(final String agentString) { + return parseCaching(agentString, clientCache, super::parse); + } + + @Override + public UserAgent parseUserAgent(final String agentString) { + return parseCaching(agentString, userAgentCache, super::parseUserAgent); + } + + @Override + public Device parseDevice(final String agentString) { + return parseCaching(agentString, deviceCache, super::parseDevice); + } + + @Override + public OS parseOS(final String agentString) { + return parseCaching(agentString, osCache, super::parseOS); + } + + private <T> T parseCaching( + final String agentString, + final Cache<String, T> cache, + final Function<String, T> parseFunction) { + if (agentString == null) { + return null; + } + return cache.get(agentString, parseFunction); + } + + private static <T> Cache<String, T> createCache(final long maximumSize) { + return Caffeine.newBuilder() + .maximumSize(maximumSize) + .build(); + } +} diff --git a/data-prepper-plugins/user-agent-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/useragent/UserAgentProcessor.java b/data-prepper-plugins/user-agent-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/useragent/UserAgentProcessor.java index 220bb88287..c84b308645 100644 --- a/data-prepper-plugins/user-agent-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/useragent/UserAgentProcessor.java +++ b/data-prepper-plugins/user-agent-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/useragent/UserAgentProcessor.java @@ -9,12 +9,13 @@ import org.opensearch.dataprepper.model.annotations.DataPrepperPlugin; import org.opensearch.dataprepper.model.annotations.DataPrepperPluginConstructor; import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventKey; +import org.opensearch.dataprepper.model.event.EventKeyFactory; import org.opensearch.dataprepper.model.processor.AbstractProcessor; import org.opensearch.dataprepper.model.processor.Processor; import org.opensearch.dataprepper.model.record.Record; import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import ua_parser.CachingParser; import ua_parser.Client; import ua_parser.Parser; @@ -31,12 +32,19 @@ public class UserAgentProcessor extends AbstractProcessor<Record<Event>, Record< private static final Logger LOG = LoggerFactory.getLogger(UserAgentProcessor.class); private final UserAgentProcessorConfig config; private final Parser userAgentParser; + private final EventKey sourceKey; + private final EventKey targetKey; @DataPrepperPluginConstructor - public UserAgentProcessor(final PluginMetrics pluginMetrics, final UserAgentProcessorConfig config) { + public UserAgentProcessor( + final UserAgentProcessorConfig config, + final EventKeyFactory eventKeyFactory, + final PluginMetrics pluginMetrics) { super(pluginMetrics); this.config = config; - this.userAgentParser = new CachingParser(config.getCacheSize()); + this.userAgentParser = new CaffeineCachingParser(config.getCacheSize()); + this.sourceKey = config.getSource(); + this.targetKey = eventKeyFactory.createEventKey(config.getTarget(), EventKeyFactory.EventAction.PUT); } @Override @@ -45,7 +53,7 @@ public Collection<Record<Event>> doExecute(final Collection<Record<Event>> recor final Event event = record.getData(); try { - final String userAgentStr = event.get(config.getSource(), String.class); + final
String userAgentStr = event.get(sourceKey, String.class); Objects.requireNonNull(userAgentStr); final Client clientInfo = this.userAgentParser.parse(userAgentStr); @@ -54,10 +62,10 @@ public Collection<Record<Event>> doExecute(final Collection<Record<Event>> recor if (!config.getExcludeOriginal()) { parsedUserAgent.put("original", userAgentStr); } - event.put(config.getTarget(), parsedUserAgent); + event.put(targetKey, parsedUserAgent); } catch (Exception e) { LOG.error(EVENT, "An exception occurred when parsing user agent data from event [{}] with source key [{}]", - event, config.getSource(), e); + event, sourceKey, e); final List<String> tagsOnParseFailure = config.getTagsOnParseFailure(); if (Objects.nonNull(tagsOnParseFailure) && tagsOnParseFailure.size() > 0) { diff --git a/data-prepper-plugins/user-agent-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/useragent/UserAgentProcessorConfig.java b/data-prepper-plugins/user-agent-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/useragent/UserAgentProcessorConfig.java index e62fc5a2da..0dcf46e2a1 100644 --- a/data-prepper-plugins/user-agent-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/useragent/UserAgentProcessorConfig.java +++ b/data-prepper-plugins/user-agent-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/useragent/UserAgentProcessorConfig.java @@ -8,6 +8,9 @@ import com.fasterxml.jackson.annotation.JsonProperty; import jakarta.validation.constraints.NotEmpty; import jakarta.validation.constraints.NotNull; +import org.opensearch.dataprepper.model.event.EventKey; +import org.opensearch.dataprepper.model.event.EventKeyConfiguration; +import org.opensearch.dataprepper.model.event.EventKeyFactory; import java.util.List; @@ -18,7 +21,8 @@ public class UserAgentProcessorConfig { @NotEmpty @NotNull @JsonProperty("source") - private String source; + @EventKeyConfiguration(EventKeyFactory.EventAction.GET) + private EventKey source; @NotNull @JsonProperty("target") @@ -34,7 +38,7 @@ public class UserAgentProcessorConfig { @JsonProperty("tags_on_parse_failure") private List<String> tagsOnParseFailure; - public String getSource() { + public EventKey getSource() { return source; } diff --git a/data-prepper-plugins/user-agent-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/useragent/CaffeineCachingParserTest.java b/data-prepper-plugins/user-agent-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/useragent/CaffeineCachingParserTest.java new file mode 100644 index 0000000000..14c72fa354 --- /dev/null +++ b/data-prepper-plugins/user-agent-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/useragent/CaffeineCachingParserTest.java @@ -0,0 +1,152 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.dataprepper.plugins.processor.useragent; + +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import ua_parser.Client; +import ua_parser.Device; +import ua_parser.OS; +import ua_parser.UserAgent; + +import static org.hamcrest.CoreMatchers.equalTo; +import static org.hamcrest.CoreMatchers.notNullValue; +import static org.hamcrest.CoreMatchers.nullValue; +import static org.hamcrest.CoreMatchers.sameInstance; +import static org.hamcrest.MatcherAssert.assertThat; + +@SuppressWarnings("StringOperationCanBeSimplified") +class CaffeineCachingParserTest { + private static final String KNOWN_USER_AGENT_STRING = "Mozilla/5.0 (iPhone; CPU iPhone OS 13_5_1 like Mac OS X) 
AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.1.1 Mobile/15E148 Safari/604.1"; + long cacheSize; + + @BeforeEach + void setUp() { + cacheSize = 1000; + } + + private CaffeineCachingParser createObjectUnderTest() { + return new CaffeineCachingParser(cacheSize); + } + + @Test + void parse_returns_expected_results() { + final Client client = createObjectUnderTest().parse(KNOWN_USER_AGENT_STRING); + + assertThat(client, notNullValue()); + assertThat(client.userAgent, notNullValue()); + assertThat(client.userAgent.family, equalTo("Mobile Safari")); + assertThat(client.userAgent.major, equalTo("13")); + assertThat(client.device.family, equalTo("iPhone")); + assertThat(client.os.family, equalTo("iOS")); + } + + @Test + void parse_with_null_returns_null() { + assertThat(createObjectUnderTest().parse(null), + nullValue()); + } + + @Test + void parse_called_multiple_times_returns_same_instance() { + final CaffeineCachingParser objectUnderTest = createObjectUnderTest(); + + final String userAgentString = KNOWN_USER_AGENT_STRING; + final Client client = objectUnderTest.parse(userAgentString); + + assertThat(client, notNullValue()); + + assertThat(objectUnderTest.parse(new String(userAgentString)), sameInstance(client)); + assertThat(objectUnderTest.parse(new String(userAgentString)), sameInstance(client)); + assertThat(objectUnderTest.parse(new String(userAgentString)), sameInstance(client)); + } + + @Test + void parseUserAgent_returns_expected_results() { + final UserAgent userAgent = createObjectUnderTest().parseUserAgent(KNOWN_USER_AGENT_STRING); + + assertThat(userAgent, notNullValue()); + assertThat(userAgent.family, equalTo("Mobile Safari")); + assertThat(userAgent.major, equalTo("13")); + } + + @Test + void parseUserAgent_with_null_returns_null() { + assertThat(createObjectUnderTest().parseUserAgent(null), + nullValue()); + } + + @Test + void parseUserAgent_called_multiple_times_returns_same_instance() { + final CaffeineCachingParser objectUnderTest = createObjectUnderTest(); + + final String userAgentString = KNOWN_USER_AGENT_STRING; + final UserAgent userAgent = objectUnderTest.parseUserAgent(userAgentString); + + assertThat(userAgent, notNullValue()); + + assertThat(objectUnderTest.parseUserAgent(new String(userAgentString)), sameInstance(userAgent)); + assertThat(objectUnderTest.parseUserAgent(new String(userAgentString)), sameInstance(userAgent)); + assertThat(objectUnderTest.parseUserAgent(new String(userAgentString)), sameInstance(userAgent)); + } + + @Test + void parseDevice_returns_expected_results() { + final Device device = createObjectUnderTest().parseDevice(KNOWN_USER_AGENT_STRING); + + assertThat(device, notNullValue()); + assertThat(device.family, equalTo("iPhone")); + } + + @Test + void parseDevice_with_null_returns_null() { + assertThat(createObjectUnderTest().parseDevice(null), + nullValue()); + } + + @Test + void parseDevice_called_multiple_times_returns_same_instance() { + final CaffeineCachingParser objectUnderTest = createObjectUnderTest(); + + final String userAgentString = KNOWN_USER_AGENT_STRING; + final Device device = objectUnderTest.parseDevice(userAgentString); + + assertThat(device, notNullValue()); + + assertThat(objectUnderTest.parseDevice(new String(userAgentString)), sameInstance(device)); + assertThat(objectUnderTest.parseDevice(new String(userAgentString)), sameInstance(device)); + assertThat(objectUnderTest.parseDevice(new String(userAgentString)), sameInstance(device)); + } + + @Test + void parseOS_returns_expected_results() { + final OS os = 
createObjectUnderTest().parseOS(KNOWN_USER_AGENT_STRING); + + assertThat(os, notNullValue()); + assertThat(os.family, equalTo("iOS")); + assertThat(os.major, equalTo("13")); + } + + @Test + void parseOS_with_null_returns_null() { + assertThat(createObjectUnderTest().parseOS(null), + nullValue()); + } + + @Test + void parseOS_called_multiple_times_returns_same_instance() { + final CaffeineCachingParser objectUnderTest = createObjectUnderTest(); + + final String userAgentString = KNOWN_USER_AGENT_STRING; + final OS os = objectUnderTest.parseOS(userAgentString); + + assertThat(os, notNullValue()); + + assertThat(objectUnderTest.parseOS(new String(userAgentString)), sameInstance(os)); + assertThat(objectUnderTest.parseOS(new String(userAgentString)), sameInstance(os)); + assertThat(objectUnderTest.parseOS(new String(userAgentString)), sameInstance(os)); + } +} \ No newline at end of file diff --git a/data-prepper-plugins/user-agent-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/useragent/UserAgentProcessorTest.java b/data-prepper-plugins/user-agent-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/useragent/UserAgentProcessorTest.java index da0923f509..a346218d0a 100644 --- a/data-prepper-plugins/user-agent-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/useragent/UserAgentProcessorTest.java +++ b/data-prepper-plugins/user-agent-processor/src/test/java/org/opensearch/dataprepper/plugins/processor/useragent/UserAgentProcessorTest.java @@ -12,8 +12,10 @@ import org.junit.jupiter.params.provider.MethodSource; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; +import org.opensearch.dataprepper.event.TestEventKeyFactory; import org.opensearch.dataprepper.metrics.PluginMetrics; import org.opensearch.dataprepper.model.event.Event; +import org.opensearch.dataprepper.model.event.EventKeyFactory; import org.opensearch.dataprepper.model.event.JacksonEvent; import org.opensearch.dataprepper.model.record.Record; @@ -38,11 +40,13 @@ class UserAgentProcessorTest { @Mock private UserAgentProcessorConfig mockConfig; + private final EventKeyFactory eventKeyFactory = TestEventKeyFactory.getTestEventFactory(); + @ParameterizedTest @MethodSource("userAgentStringArguments") public void testParsingUserAgentStrings( String uaString, String uaName, String uaVersion, String osName, String osVersion, String osFull, String deviceName) { - when(mockConfig.getSource()).thenReturn("source"); + when(mockConfig.getSource()).thenReturn(eventKeyFactory.createEventKey("source")); when(mockConfig.getTarget()).thenReturn("user_agent"); when(mockConfig.getCacheSize()).thenReturn(TEST_CACHE_SIZE); @@ -64,7 +68,7 @@ public void testParsingUserAgentStrings( @MethodSource("userAgentStringArguments") public void testParsingUserAgentStringsWithCustomTarget( String uaString, String uaName, String uaVersion, String osName, String osVersion, String osFull, String deviceName) { - when(mockConfig.getSource()).thenReturn("source"); + when(mockConfig.getSource()).thenReturn(eventKeyFactory.createEventKey("source")); when(mockConfig.getTarget()).thenReturn("my_target"); when(mockConfig.getCacheSize()).thenReturn(TEST_CACHE_SIZE); @@ -86,7 +90,7 @@ public void testParsingUserAgentStringsWithCustomTarget( @MethodSource("userAgentStringArguments") public void testParsingUserAgentStringsExcludeOriginal( String uaString, String uaName, String uaVersion, String osName, String osVersion, String osFull, String deviceName) { - 
diff --git a/data-prepper-test-event/src/main/java/org/opensearch/dataprepper/event/TestEventContext.java b/data-prepper-test-event/src/main/java/org/opensearch/dataprepper/event/TestEventContext.java
new file mode 100644
index 0000000000..6c5b001129
--- /dev/null
+++ b/data-prepper-test-event/src/main/java/org/opensearch/dataprepper/event/TestEventContext.java
@@ -0,0 +1,24 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.event;
+
+import org.opensearch.dataprepper.core.event.EventFactoryApplicationContextMarker;
+import org.springframework.context.annotation.AnnotationConfigApplicationContext;
+
+class TestEventContext {
+    private static AnnotationConfigApplicationContext APPLICATION_CONTEXT;
+
+    private TestEventContext() {}
+
+    static <T> T getFromContext(final Class<T> targetClass) {
+        if(APPLICATION_CONTEXT == null) {
+            APPLICATION_CONTEXT = new AnnotationConfigApplicationContext();
+            APPLICATION_CONTEXT.scan(EventFactoryApplicationContextMarker.class.getPackageName());
+            APPLICATION_CONTEXT.refresh();
+        }
+        return APPLICATION_CONTEXT.getBean(targetClass);
+    }
+}
diff --git a/data-prepper-test-event/src/main/java/org/opensearch/dataprepper/event/TestEventFactory.java b/data-prepper-test-event/src/main/java/org/opensearch/dataprepper/event/TestEventFactory.java
index 932c9ca66a..08a2cd2f29 100644
--- a/data-prepper-test-event/src/main/java/org/opensearch/dataprepper/event/TestEventFactory.java
+++ b/data-prepper-test-event/src/main/java/org/opensearch/dataprepper/event/TestEventFactory.java
@@ -5,18 +5,15 @@
 
 package org.opensearch.dataprepper.event;
 
-import org.opensearch.dataprepper.core.event.EventFactoryApplicationContextMarker;
 import org.opensearch.dataprepper.model.event.BaseEventBuilder;
 import org.opensearch.dataprepper.model.event.Event;
 import org.opensearch.dataprepper.model.event.EventFactory;
-import org.springframework.context.annotation.AnnotationConfigApplicationContext;
 
 /**
  * An implementation of {@link EventFactory} that is useful for integration and unit tests
  * in other projects.
  */
 public class TestEventFactory implements EventFactory {
-    private static AnnotationConfigApplicationContext APPLICATION_CONTEXT;
     private static EventFactory DEFAULT_EVENT_FACTORY;
     private final EventFactory innerEventFactory;
@@ -25,11 +22,8 @@ public class TestEventFactory implements EventFactory {
     }
 
     public static EventFactory getTestEventFactory() {
-        if(APPLICATION_CONTEXT == null) {
-            APPLICATION_CONTEXT = new AnnotationConfigApplicationContext();
-            APPLICATION_CONTEXT.scan(EventFactoryApplicationContextMarker.class.getPackageName());
-            APPLICATION_CONTEXT.refresh();
-            DEFAULT_EVENT_FACTORY = APPLICATION_CONTEXT.getBean(EventFactory.class);
+        if(DEFAULT_EVENT_FACTORY == null) {
+            DEFAULT_EVENT_FACTORY = TestEventContext.getFromContext(EventFactory.class);
         }
         return new TestEventFactory(DEFAULT_EVENT_FACTORY);
     }
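
With context construction moved into TestEventContext, both test factories lazily share one Spring ApplicationContext. One caveat worth noting: getFromContext is not synchronized, so two threads racing on first use could each build a context. That is harmless for the single-threaded test bootstraps these factories target, but a hedged alternative is a one-keyword change if that assumption ever breaks:

// Only needed if factories are first resolved concurrently (illustrative):
static synchronized <T> T getFromContext(final Class<T> targetClass) { ... }
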
diff --git a/data-prepper-test-event/src/main/java/org/opensearch/dataprepper/event/TestEventKeyFactory.java b/data-prepper-test-event/src/main/java/org/opensearch/dataprepper/event/TestEventKeyFactory.java
new file mode 100644
index 0000000000..0cec742924
--- /dev/null
+++ b/data-prepper-test-event/src/main/java/org/opensearch/dataprepper/event/TestEventKeyFactory.java
@@ -0,0 +1,30 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.event;
+
+import org.opensearch.dataprepper.model.event.EventKey;
+import org.opensearch.dataprepper.model.event.EventKeyFactory;
+
+public class TestEventKeyFactory implements EventKeyFactory {
+    private static EventKeyFactory DEFAULT_EVENT_KEY_FACTORY;
+    private final EventKeyFactory innerEventKeyFactory;
+
+    TestEventKeyFactory(final EventKeyFactory innerEventKeyFactory) {
+        this.innerEventKeyFactory = innerEventKeyFactory;
+    }
+
+    public static EventKeyFactory getTestEventFactory() {
+        if(DEFAULT_EVENT_KEY_FACTORY == null) {
+            DEFAULT_EVENT_KEY_FACTORY = TestEventContext.getFromContext(EventKeyFactory.class);
+        }
+        return new TestEventKeyFactory(DEFAULT_EVENT_KEY_FACTORY);
+    }
+
+    @Override
+    public EventKey createEventKey(final String key, final EventAction... forActions) {
+        return innerEventKeyFactory.createEventKey(key, forActions);
+    }
+}
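
Taken together with TestEventFactory, plugin tests can now obtain both factories without wiring Spring themselves. Typical usage, mirroring UserAgentProcessorTest above (the single-argument form delegating with EventAction.ALL is inferred from the test below, not shown in this diff):

final EventKeyFactory eventKeyFactory = TestEventKeyFactory.getTestEventFactory();
final EventKey sourceKey = eventKeyFactory.createEventKey("source");
// Or scope the key to particular actions explicitly:
final EventKey allKey = eventKeyFactory.createEventKey("target", EventKeyFactory.EventAction.ALL);
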
diff --git a/data-prepper-test-event/src/test/java/org/opensearch/dataprepper/event/TestEventKeyFactoryTest.java b/data-prepper-test-event/src/test/java/org/opensearch/dataprepper/event/TestEventKeyFactoryTest.java
new file mode 100644
index 0000000000..65b17819b8
--- /dev/null
+++ b/data-prepper-test-event/src/test/java/org/opensearch/dataprepper/event/TestEventKeyFactoryTest.java
@@ -0,0 +1,56 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.event;
+
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.EnumSource;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import org.opensearch.dataprepper.model.event.EventKey;
+import org.opensearch.dataprepper.model.event.EventKeyFactory;
+
+import java.util.UUID;
+
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.equalTo;
+import static org.mockito.Mockito.when;
+
+@ExtendWith(MockitoExtension.class)
+class TestEventKeyFactoryTest {
+
+    @Mock
+    private EventKeyFactory innerEventKeyFactory;
+
+    @Mock
+    private EventKey eventKey;
+
+    private TestEventKeyFactory createObjectUnderTest() {
+        return new TestEventKeyFactory(innerEventKeyFactory);
+    }
+
+    @Test
+    void createEventKey_returns_from_inner_EventKeyFactory() {
+        final String keyPath = UUID.randomUUID().toString();
+        when(innerEventKeyFactory.createEventKey(keyPath, EventKeyFactory.EventAction.ALL))
+                .thenReturn(eventKey);
+
+        assertThat(createObjectUnderTest().createEventKey(keyPath),
+                equalTo(eventKey));
+    }
+
+    @ParameterizedTest
+    @EnumSource(EventKeyFactory.EventAction.class)
+    void createEventKey_with_Actions_returns_from_inner_EventKeyFactory(final EventKeyFactory.EventAction eventAction) {
+        final String keyPath = UUID.randomUUID().toString();
+        when(innerEventKeyFactory.createEventKey(keyPath, eventAction))
+                .thenReturn(eventKey);
+
+        assertThat(createObjectUnderTest().createEventKey(keyPath, eventAction),
+                equalTo(eventKey));
+    }
+}
\ No newline at end of file
diff --git a/examples/trace-analytics-sample-app/sample-app/requirements.txt b/examples/trace-analytics-sample-app/sample-app/requirements.txt
index 3f7f8b5f1d..a24bef87af 100644
--- a/examples/trace-analytics-sample-app/sample-app/requirements.txt
+++ b/examples/trace-analytics-sample-app/sample-app/requirements.txt
@@ -1,10 +1,10 @@
 dash==2.15.0
 mysql-connector==2.2.9
-opentelemetry-exporter-otlp==1.20.0
-opentelemetry-instrumentation-flask==0.41b0
-opentelemetry-instrumentation-mysql==0.41b0
-opentelemetry-instrumentation-requests==0.41b0
-opentelemetry-sdk==1.20.0
+opentelemetry-exporter-otlp==1.25.0
+opentelemetry-instrumentation-flask==0.46b0
+opentelemetry-instrumentation-mysql==0.46b0
+opentelemetry-instrumentation-requests==0.46b0
+opentelemetry-sdk==1.25.0
 protobuf==3.20.3
-urllib3==2.0.7
+urllib3==2.2.2
 werkzeug==3.0.3
\ No newline at end of file
diff --git a/gradle/wrapper/gradle-wrapper.properties b/gradle/wrapper/gradle-wrapper.properties
index b82aa23a4f..a4413138c9 100644
--- a/gradle/wrapper/gradle-wrapper.properties
+++ b/gradle/wrapper/gradle-wrapper.properties
@@ -1,6 +1,6 @@
 distributionBase=GRADLE_USER_HOME
 distributionPath=wrapper/dists
-distributionUrl=https\://services.gradle.org/distributions/gradle-8.7-bin.zip
+distributionUrl=https\://services.gradle.org/distributions/gradle-8.8-bin.zip
 networkTimeout=10000
 validateDistributionUrl=true
 zipStoreBase=GRADLE_USER_HOME
diff --git a/gradlew b/gradlew
index 1aa94a4269..b740cf1339 100755
--- a/gradlew
+++ b/gradlew
@@ -55,7 +55,7 @@
 #       Darwin, MinGW, and NonStop.
 #
 #   (3) This script is generated from the Groovy template
-#       https://github.com/gradle/gradle/blob/HEAD/subprojects/plugins/src/main/resources/org/gradle/api/internal/plugins/unixStartScript.txt
+#       https://github.com/gradle/gradle/blob/HEAD/platforms/jvm/plugins-application/src/main/resources/org/gradle/api/internal/plugins/unixStartScript.txt
 #       within the Gradle project.
 #
 #   You can find Gradle at https://github.com/gradle/gradle/.
diff --git a/performance-test/build.gradle b/performance-test/build.gradle
index 8c4a9693d2..6dd99cb08d 100644
--- a/performance-test/build.gradle
+++ b/performance-test/build.gradle
@@ -15,7 +15,7 @@ configurations.all {
 group 'org.opensearch.dataprepper.test.performance'
 
 dependencies {
-    gatlingImplementation 'software.amazon.awssdk:auth:2.25.21'
+    gatlingImplementation 'software.amazon.awssdk:auth:2.26.12'
     implementation 'com.fasterxml.jackson.core:jackson-core'
 
     testRuntimeOnly testLibs.junit.engine
diff --git a/performance-test/src/gatling/java/org/opensearch/dataprepper/test/performance/tools/Templates.java b/performance-test/src/gatling/java/org/opensearch/dataprepper/test/performance/tools/Templates.java
index 27df8a1848..67347ff393 100644
--- a/performance-test/src/gatling/java/org/opensearch/dataprepper/test/performance/tools/Templates.java
+++ b/performance-test/src/gatling/java/org/opensearch/dataprepper/test/performance/tools/Templates.java
@@ -7,12 +7,14 @@
 
 import io.gatling.javaapi.core.Session;
 import org.opensearch.dataprepper.test.data.generation.IpAddress;
+import org.opensearch.dataprepper.test.data.generation.UserAgent;
 
 import java.time.LocalDateTime;
 import java.time.format.DateTimeFormatter;
 import java.util.List;
 import java.util.Random;
 import java.util.function.Function;
+import java.util.function.Supplier;
 import java.util.stream.Collectors;
 import java.util.stream.IntStream;
 
@@ -30,19 +32,28 @@ public static String now() {
     }
 
     public static Function<Session, String> apacheCommonLogTemplate(final int batchSize) {
-        return session -> {
-            final List<String> logs = IntStream.range(0, batchSize)
-                    .mapToObj(i -> "{\"log\": \"" + ipAddress() + " - frank [" + now() + "] \\\"" + httpMethod() + " /apache_pb.gif HTTP/1.0\\\" "+ statusCode() + " " + responseSize() + "\"}")
-                    .collect(Collectors.toList());
-            final String logArray = String.join(",", logs);
-            return "[" + logArray + "]";
-        };
+        return generateLogArray(batchSize,
+                () -> ipAddress() + " - frank [" + now() + "] \\\"" + httpMethod() + " /apache_pb.gif HTTP/1.0\\\" "+ statusCode() + " " + responseSize());
+    }
+
+    public static Function<Session, String> userAgent(final int batchSize) {
+        return generateLogArray(batchSize, () -> userAgent());
+    }
+
+    private static Function<Session, String> generateLogArray(final int batchSize, final Supplier<String> stringSupplier) {
+        return session -> IntStream.range(0, batchSize)
+                .mapToObj(i -> "{\"log\": \"" + stringSupplier.get() + "\"}")
+                .collect(Collectors.joining(",", "[", "]"));
     }
 
     private static String ipAddress() {
         return IpAddress.getInstance().ipAddress();
     }
 
+    private static String userAgent() {
+        return UserAgent.getInstance().userAgent();
+    }
+
     private static String httpMethod() {
         return randomFromList(HTTP_METHODS);
     }
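
The refactored builder turns any string supplier into a batched JSON body, so new payload shapes need only a supplier. A sketch of wiring the new template into a Gatling scenario, assuming the usual Java DSL (the simulation classes are outside this diff):

// StringBody accepts a Function<Session, String>, which generateLogArray returns.
import static io.gatling.javaapi.core.CoreDsl.StringBody;

// ... inside a scenario definition:
// .body(StringBody(Templates.userAgent(200)))
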
diff --git a/performance-test/src/main/java/org/opensearch/dataprepper/test/data/generation/UserAgent.java b/performance-test/src/main/java/org/opensearch/dataprepper/test/data/generation/UserAgent.java
new file mode 100644
index 0000000000..ced3b090c4
--- /dev/null
+++ b/performance-test/src/main/java/org/opensearch/dataprepper/test/data/generation/UserAgent.java
@@ -0,0 +1,66 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.test.data.generation;
+
+import java.util.Random;
+import java.util.UUID;
+
+public class UserAgent {
+    private static final UserAgent USER_AGENT = new UserAgent();
+    private final Random random;
+
+    private UserAgent() {
+        random = new Random();
+    }
+
+    public static UserAgent getInstance() {
+        return USER_AGENT;
+    }
+
+    public String userAgent() {
+        final StringBuilder userAgentBuilder = new StringBuilder();
+
+        buildBrowserPart(userAgentBuilder);
+        userAgentBuilder.append(" (");
+
+        buildDevicePart(userAgentBuilder);
+        userAgentBuilder.append(" ");
+
+        buildOsPart(userAgentBuilder);
+        userAgentBuilder.append(")");
+
+        return userAgentBuilder.toString();
+    }
+
+    private void buildOsPart(final StringBuilder userAgentBuilder) {
+        userAgentBuilder.append(randomString());
+        userAgentBuilder.append(" ");
+        buildVersionString(userAgentBuilder);
+    }
+
+    private void buildDevicePart(final StringBuilder userAgentBuilder) {
+        userAgentBuilder.append(randomString());
+        userAgentBuilder.append(";");
+    }
+
+    private void buildBrowserPart(final StringBuilder userAgentBuilder) {
+        userAgentBuilder.append(randomString());
+        userAgentBuilder.append("/");
+        buildVersionString(userAgentBuilder);
+    }
+
+    private void buildVersionString(final StringBuilder userAgentBuilder) {
+        userAgentBuilder.append(random.nextInt(9) + 1);
+        userAgentBuilder.append(".");
+        userAgentBuilder.append(random.nextInt(30));
+        userAgentBuilder.append(".");
+        userAgentBuilder.append(random.nextInt(30));
+    }
+
+    private static String randomString() {
+        return UUID.randomUUID().toString().replaceAll("-", "");
+    }
+}
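
Each call yields a synthetic agent of the shape browser/1.22.3 (device; os 4.5.6): UUID-derived alphanumeric tokens with dotted versions, where the major component is 1-9 and the minor and patch components are 0-29. These are deliberately not real browser strings, and because every generated value is effectively unique, the load test keeps exercising the parse path rather than hitting a warm cache. The test below documents exactly this format.
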
diff --git a/performance-test/src/test/java/org/opensearch/dataprepper/test/data/generation/UserAgentTest.java b/performance-test/src/test/java/org/opensearch/dataprepper/test/data/generation/UserAgentTest.java
new file mode 100644
index 0000000000..0967c3c3e4
--- /dev/null
+++ b/performance-test/src/test/java/org/opensearch/dataprepper/test/data/generation/UserAgentTest.java
@@ -0,0 +1,40 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.dataprepper.test.data.generation;
+
+import org.junit.jupiter.api.Test;
+
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.equalTo;
+import static org.hamcrest.Matchers.greaterThanOrEqualTo;
+import static org.hamcrest.Matchers.matchesPattern;
+import static org.hamcrest.Matchers.not;
+import static org.hamcrest.Matchers.notNullValue;
+
+class UserAgentTest {
+    @Test
+    void userAgent_returns_string() {
+        final String userAgent = UserAgent.getInstance().userAgent();
+
+        assertThat(userAgent, notNullValue());
+        assertThat(userAgent.length(), greaterThanOrEqualTo(10));
+
+        String expectedRegex = "^[A-Za-z0-9]+/[0-9]+\\.[0-9]+\\.[0-9]+ \\([A-Za-z0-9]+; [A-Za-z0-9]+ [0-9]+\\.[0-9]+\\.[0-9]+\\)$";
+
+        assertThat(userAgent, matchesPattern(expectedRegex));
+    }
+
+    @Test
+    void userAgent_returns_unique_value_on_multiple_calls() {
+        final UserAgent objectUnderTest = UserAgent.getInstance();
+        final String userAgent = objectUnderTest.userAgent();
+
+        assertThat(userAgent, notNullValue());
+        assertThat(objectUnderTest.userAgent(), not(equalTo(userAgent)));
+        assertThat(objectUnderTest.userAgent(), not(equalTo(userAgent)));
+        assertThat(objectUnderTest.userAgent(), not(equalTo(userAgent)));
+    }
+}
\ No newline at end of file
diff --git a/release/smoke-tests/otel-span-exporter/requirements.txt b/release/smoke-tests/otel-span-exporter/requirements.txt
index 3cd451baf4..f2e5b97c35 100644
--- a/release/smoke-tests/otel-span-exporter/requirements.txt
+++ b/release/smoke-tests/otel-span-exporter/requirements.txt
@@ -1,19 +1,19 @@
 backoff==1.10.0
-certifi==2023.7.22
+certifi==2024.07.04
 charset-normalizer==2.0.9
 Deprecated==1.2.13
 googleapis-common-protos==1.53.0
 grpcio==1.53.2
-idna==3.3
-opentelemetry-api==1.7.1
-opentelemetry-exporter-otlp==1.7.1
-opentelemetry-exporter-otlp-proto-grpc==1.7.1
-opentelemetry-exporter-otlp-proto-http==1.7.1
-opentelemetry-proto==1.7.1
-opentelemetry-sdk==1.7.1
-opentelemetry-semantic-conventions==0.26b1
+idna==3.7
+opentelemetry-api==1.25.0
+opentelemetry-exporter-otlp==1.25.0
+opentelemetry-exporter-otlp-proto-grpc==1.25.0
+opentelemetry-exporter-otlp-proto-http==1.25.0
+opentelemetry-proto==1.25.0
+opentelemetry-sdk==1.25.0
+opentelemetry-semantic-conventions==0.46b0
 protobuf==3.19.5
 requests==2.32.3
 six==1.16.0
-urllib3==1.26.18
+urllib3==1.26.19
 wrapt==1.13.3
diff --git a/release/staging-resources-cdk/package-lock.json b/release/staging-resources-cdk/package-lock.json
index 7ac1eaed21..32da99d8c9 100644
--- a/release/staging-resources-cdk/package-lock.json
+++ b/release/staging-resources-cdk/package-lock.json
@@ -7720,9 +7720,9 @@
       }
     },
     "node_modules/ws": {
-      "version": "7.5.9",
-      "resolved": "https://registry.npmjs.org/ws/-/ws-7.5.9.tgz",
-      "integrity": "sha512-F+P9Jil7UiSKSkppIiD94dN07AwvFixvLIj1Og1Rl9GGMuNipJnV9JzjD6XuqmAeiswGvUmNLjr5cFuXwNS77Q==",
+      "version": "7.5.10",
+      "resolved": "https://registry.npmjs.org/ws/-/ws-7.5.10.tgz",
+      "integrity": "sha512-+dbF1tHwZpXcbOJdVOkzLDxZP1ailvSxM6ZweXTegylPny803bFhA+vqBYw4s31NSAk4S2Qz+AKXK9a4wkdjcQ==",
       "dev": true,
       "engines": {
         "node": ">=8.3.0"
@@ -13755,9 +13755,9 @@
       }
     },
     "ws": {
-      "version": "7.5.9",
-      "resolved": "https://registry.npmjs.org/ws/-/ws-7.5.9.tgz",
-      "integrity": "sha512-F+P9Jil7UiSKSkppIiD94dN07AwvFixvLIj1Og1Rl9GGMuNipJnV9JzjD6XuqmAeiswGvUmNLjr5cFuXwNS77Q==",
+      "version": "7.5.10",
+      "resolved": "https://registry.npmjs.org/ws/-/ws-7.5.10.tgz",
+      "integrity": "sha512-+dbF1tHwZpXcbOJdVOkzLDxZP1ailvSxM6ZweXTegylPny803bFhA+vqBYw4s31NSAk4S2Qz+AKXK9a4wkdjcQ==",
      "dev": true,
       "requires": {}
     },
diff --git a/settings.gradle b/settings.gradle
index 0390956974..e5e1c2b98a 100644
--- a/settings.gradle
+++ b/settings.gradle
@@ -30,7 +30,7 @@ dependencyResolutionManagement {
         libs {
             version('slf4j', '2.0.6')
             library('slf4j-api', 'org.slf4j', 'slf4j-api').versionRef('slf4j')
-            version('armeria', '1.28.2')
+            version('armeria', '1.29.0')
             library('armeria-core', 'com.linecorp.armeria', 'armeria').versionRef('armeria')
             library('armeria-grpc', 'com.linecorp.armeria', 'armeria-grpc').versionRef('armeria')
             library('armeria-junit', 'com.linecorp.armeria', 'armeria-junit5').versionRef('armeria')
@@ -60,20 +60,25 @@ dependencyResolutionManagement {
             library('commons-io', 'commons-io', 'commons-io').version('2.15.1')
             library('commons-codec', 'commons-codec', 'commons-codec').version('1.16.0')
             library('commons-compress', 'org.apache.commons', 'commons-compress').version('1.24.0')
-            version('hadoop', '3.3.6')
+            version('parquet', '1.14.1')
+            library('parquet-common', 'org.apache.parquet', 'parquet-common').versionRef('parquet')
+            library('parquet-avro', 'org.apache.parquet', 'parquet-avro').versionRef('parquet')
+            library('parquet-column', 'org.apache.parquet', 'parquet-column').versionRef('parquet')
+            library('parquet-hadoop', 'org.apache.parquet', 'parquet-hadoop').versionRef('parquet')
+            version('hadoop', '3.4.0')
             library('hadoop-common', 'org.apache.hadoop', 'hadoop-common').versionRef('hadoop')
             library('hadoop-mapreduce', 'org.apache.hadoop', 'hadoop-mapreduce-client-core').versionRef('hadoop')
             version('avro', '1.11.3')
             library('avro-core', 'org.apache.avro', 'avro').versionRef('avro')
+            library('caffeine', 'com.github.ben-manes.caffeine', 'caffeine').version('3.1.8')
         }
         testLibs {
             version('junit', '5.8.2')
-            version('mockito', '3.11.2')
+            version('mockito', '5.12.0')
             version('hamcrest', '2.2')
             version('awaitility', '4.2.0')
             version('spring', '5.3.28')
             version('slf4j', '2.0.6')
-            version('hadoop', '3.3.6')
             library('junit-core', 'org.junit.jupiter', 'junit-jupiter').versionRef('junit')
             library('junit-params', 'org.junit.jupiter', 'junit-jupiter-params').versionRef('junit')
             library('junit-engine', 'org.junit.jupiter', 'junit-jupiter-engine').versionRef('junit')
@@ -87,7 +92,6 @@ dependencyResolutionManagement {
             library('awaitility', 'org.awaitility', 'awaitility').versionRef('awaitility')
             library('spring-test', 'org.springframework', 'spring-test').versionRef('spring')
             library('slf4j-simple', 'org.slf4j', 'slf4j-simple').versionRef('slf4j')
-            library('hadoop-common', 'org.apache.hadoop', 'hadoop-common').versionRef('hadoop')
         }
     }
 }
@@ -174,5 +178,5 @@ include 'data-prepper-plugins:mongodb'
 include 'data-prepper-plugins:rds-source'
 include 'data-prepper-plugins:http-source-common'
 include 'data-prepper-plugins:http-common'
-include 'data-prepper-plugins:lambda-sink'
-include 'data-prepper-plugins:opensearch-api-source'
\ No newline at end of file
+include 'data-prepper-plugins:opensearch-api-source'
+include 'data-prepper-plugins:lambda'
\ No newline at end of file
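
Modules consume these catalog entries through Gradle's type-safe accessors, e.g. implementation libs.caffeine for the user-agent processor's cache, or implementation libs.parquet.avro for the Parquet libraries (dashed aliases such as 'parquet-avro' surface as dotted accessors).
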
diff --git a/testing/aws-testing-cdk/package-lock.json b/testing/aws-testing-cdk/package-lock.json
index fbb7310d4f..c7ae43fe77 100644
--- a/testing/aws-testing-cdk/package-lock.json
+++ b/testing/aws-testing-cdk/package-lock.json
@@ -2310,12 +2310,12 @@
       }
     },
     "node_modules/braces": {
-      "version": "3.0.2",
-      "resolved": "https://registry.npmjs.org/braces/-/braces-3.0.2.tgz",
-      "integrity": "sha512-b8um+L1RzM3WDSzvhm6gIz1yfTbBt6YTlcEKAvsmqCZZFw46z626lVj9j1yEPW33H5H+lBQpZMP1k8l+78Ha0A==",
+      "version": "3.0.3",
+      "resolved": "https://registry.npmjs.org/braces/-/braces-3.0.3.tgz",
+      "integrity": "sha512-yQbXgO/OSZVD2IsiLlro+7Hf6Q18EJrKSEsdoMzKePKXct3gvD8oLcOQdIzGupr5Fj+EDe8gO/lxc1BzfMpxvA==",
       "dev": true,
       "dependencies": {
-        "fill-range": "^7.0.1"
+        "fill-range": "^7.1.1"
       },
       "engines": {
         "node": ">=8"
@@ -3102,9 +3102,9 @@
       }
     },
     "node_modules/fill-range": {
-      "version": "7.0.1",
-      "resolved": "https://registry.npmjs.org/fill-range/-/fill-range-7.0.1.tgz",
-      "integrity": "sha512-qOo9F+dMUmC2Lcb4BbVvnKJxTPjCm+RRpe4gDuGrzkL7mEVl/djYSu2OdQ2Pa302N4oqkSg9ir6jaLWJ2USVpQ==",
+      "version": "7.1.1",
+      "resolved": "https://registry.npmjs.org/fill-range/-/fill-range-7.1.1.tgz",
+      "integrity": "sha512-YsGpe3WHLK8ZYi4tWDg2Jy3ebRz2rXowDxnld4bkQB00cc/1Zw9AWnC0i9ztDJitivtQvaI9KaLyKrc+hBW0yg==",
       "dev": true,
       "dependencies": {
        "to-regex-range": "^5.0.1"