
Logstash [v8.3.3] adding "event" field to payload. #193

Open
karun-singh opened this issue Sep 28, 2022 · 0 comments
The Logstash pipeline is adding an "event" field to the payload that contains the original message received from RabbitMQ. This causes Elasticsearch to reject the document, since the extra field violates the strict mapping set for the indices.

Tested using the RMQ test script here

Error from Logstash:

{
	"level": "WARN",
	"loggerName": "logstash.outputs.elasticsearch",
	"timeMillis": 1664274188010,
	"thread": "[resource-group]>worker0",
	"logEvent": {
		"message": "Could not index event to Elasticsearch.",
		"status": 400,
		"action": ["index", {
			"_index": "test-itms"
		}, {
			"id": "test-itms",
			"last_stop_arrival_time": "15:09:58",
			"trip_delay": 948,
			"actual_trip_start_time": "2020-11-03T14:22:30+05:30",
			"license_plate": "10",
			"last_stop_id": "2028",
			"event": {
				"original": "{\"trip_direction\": \"NT\", \"trip_id\": \"24374871\", \"route_id\": \"17AD\", \"trip_delay\": 948, \"last_stop_arrival_time\": \"15:09:58\", \"actual_trip_start_time\": \"2020-11-03T14:22:30+05:30\", \"vehicle_label\": \"A09\", \"observationDateTime\": \"2022-09-27T10:23:07+00:00\", \"speed\": 25.0, \"license_plate\": \"10\", \"last_stop_id\": \"2028\", \"location\": {\"coordinates\": [72.870511, 21.218943], \"type\": \"Point\"}, \"id\": \"test-itms\"}"
			},
			"trip_id": "24374871",
			"vehicle_label": "A09",
			"route_id": "17AD",
			"observationDateTime": "2022-09-27T10:23:07+00:00",
			"location": {
				"coordinates": [72.870511, 21.218943],
				"type": "Point"
			},
			"speed": 25.0,
			"trip_direction": "NT"
		}],
		"response": {
			"index": {
				"_index": "test-itms",
				"_id": "BMN4foMBfBEPN2SVxrea",
				"status": 400,
				"error": {
					"type": "strict_dynamic_mapping_exception",
					"reason": "mapping set to strict, dynamic introduction of [event] within [_doc] is not allowed"
				}
			}
		}
	}
}
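
For context, the rejection happens because dynamic mapping is set to strict on the index, so any top-level field not declared in the mapping (such as event) is refused. Below is a minimal sketch of what such a mapping might look like; the field names are taken from the payload above, but the types and the overall shape are assumptions, not the actual test-itms index definition:

PUT test-itms
{
	"mappings": {
		"dynamic": "strict",
		"properties": {
			"id":                  { "type": "keyword" },
			"trip_id":             { "type": "keyword" },
			"observationDateTime": { "type": "date" },
			"location":            { "type": "geo_point" },
			"speed":               { "type": "float" }
		}
	}
}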

Current solution:

This was temporarily fixed in commit 8ef8f5a by manually removing the event field in the pipeline filter. The root cause still needs to be debugged so a permanent fix can be made.
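
For reference, the workaround applied in the pipeline filter presumably looks something like the sketch below (the exact filter in commit 8ef8f5a may differ):

filter {
	# Drop the [event] object (including [event][original]) added by Logstash 8.x
	# so the document matches the strict index mapping again.
	mutate {
		remove_field => ["event"]
	}
}

As a possible lead for the permanent fix: in Logstash 8.x the [event][original] field is typically added by codecs (for example the json codec) when ecs_compatibility is enabled, which is the default, so setting ecs_compatibility => disabled on the input's codec may be worth investigating as an alternative to stripping the field in the filter.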

abhi4578 added a commit that referenced this issue Apr 9, 2023
- postgres max connections increased as more no. of components use it
- Logstash fix pipeline, by removing event field in docker
  see issue #193
pranavv0 pushed a commit to pranavv0/iudx-deployment that referenced this issue May 1, 2023
- postgres max connections increased as more no. of components use it
- Logstash fix pipeline, by removing event field in docker
  see issue datakaveri#193