SwiftPM binaryTarget gives 'no such module' error when archiving

I'm trying to refactor the project using SwiftPM, and everything works fine, both in the simulator and on my iPhone. But when I archive the project, I get the error 'no such module 'SFS2XAPIIOS''.
Here's the code of my Package.swift:
// swift-tools-version:5.3
// The swift-tools-version declares the minimum version of Swift required to build this package.
import PackageDescription

let package = Package(
    name: "BaseIM",
    platforms: [
        .iOS(.v11)
    ],
    products: [
        .library(name: "BaseIM", targets: ["BaseIM"]),
        .library(name: "SFS2XAPIIOSX", targets: ["SFS2XAPIIOS"])
    ],
    dependencies: [
        .package(name: "BaseTools", url: "http://192.168.1.28:8888/kevin/basetools.git", .branch("master")),
        .package(name: "BaseClass", url: "http://192.168.1.28:8888/kevin/baseclass.git", .branch("master")),
        .package(name: "MediaKit", url: "http://192.168.1.28:8888/kevin/mediakit.git", .branch("master")),
        .package(name: "Realm", url: "https://github.com/realm/realm-cocoa", .upToNextMajor(from: "10.1.4"))
    ],
    targets: [
        .target(
            name: "BaseIM",
            dependencies: [
                "SFS2XAPIIOSX", "BaseTools", "BaseClass", "MediaKit",
                .product(name: "RealmSwift", package: "Realm")
            ]
        ),
        .target(
            name: "SFS2XAPIIOSX",
            dependencies: [
                "SFS2XAPIIOS"
            ],
            path: "SFS2XAPIIOS",
            cSettings: [
                .headerSearchPath("Header.h")
            ]
        ),
        .binaryTarget(name: "SFS2XAPIIOS", path: "SFS2XAPIIOS/SFS2XAPIIOS.xcframework"),
        .testTarget(
            name: "BaseIMTests",
            dependencies: ["BaseIM"]),
    ]
)
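There is no accepted fix in the thread, but one detail worth noting is that the SFS2XAPIIOSX product above vends the binary target SFS2XAPIIOS directly, while the other targets depend on the wrapper target. Below is a minimal sketch of the alternative layout often used for wrapping an .xcframework, in which the product vends the wrapper target and only the wrapper links the binary; this is an illustration, not a confirmed solution to the archive error:

// swift-tools-version:5.3
// Sketch only: the product vends the wrapper target, and the binary target stays internal.
import PackageDescription

let package = Package(
    name: "BaseIM",
    platforms: [.iOS(.v11)],
    products: [
        .library(name: "BaseIM", targets: ["BaseIM"]),
        // Vend the wrapper target rather than the binary target itself.
        .library(name: "SFS2XAPIIOSX", targets: ["SFS2XAPIIOSX"])
    ],
    targets: [
        .target(
            name: "SFS2XAPIIOSX",
            dependencies: ["SFS2XAPIIOS"],   // the wrapper links the xcframework
            path: "SFS2XAPIIOS"
        ),
        .binaryTarget(
            name: "SFS2XAPIIOS",
            path: "SFS2XAPIIOS/SFS2XAPIIOS.xcframework"
        ),
        .target(name: "BaseIM", dependencies: ["SFS2XAPIIOSX"])
    ]
)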

Related

Add SDK C++ headers into a Swift Package Manager project

I have a C++ project that I want to add to a Swift Package Manager project.
The C++ project references headers such as #include <string>; this header resides in
iossdk/usr/include/c++/v1.
How do I get the Swift Package Manager to include those headers?
import PackageDescription

let package = Package(
    name: "LibProject",
    platforms: [.iOS(.v13)],
    products: [
        .library(
            name: "LibProject",
            targets: ["LibModule1", "LibModule2Framework"]),
    ],
    dependencies: [
    ],
    targets: [
        .target(
            name: "LibModule1",
            path: "Sources/LibModule1"),
        .target(
            name: "LibModule2Framework",
            path: "Sources/LibModule2Framework",
            publicHeadersPath: ".",
            cxxSettings: [
                .headerSearchPath("usr/include/c++/v1"),
            ]
        ),
        .testTarget(
            name: "LibModuleTests",
            dependencies: ["LibModuleTests"]),
    ],
    cLanguageStandard: .c17,
    cxxLanguageStandard: .gnucxx17
)
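No answer is recorded here, but two details of the SwiftPM API are relevant (the paths and target layout below are illustrative assumptions, not part of the question): .headerSearchPath(_:) only accepts paths relative to the target's directory, so it cannot reach an SDK location such as iossdk/usr/include/c++/v1. The C++ standard library headers (<string> and friends) are instead found automatically once the target's sources are compiled as C++, for example by giving them a .cpp extension and setting cxxLanguageStandard. An absolute -I can only be passed via unsafeFlags, which restricts how the package can be consumed as a dependency. A minimal sketch:

// swift-tools-version:5.3  (sketch with assumed paths, not the asker's manifest)
import PackageDescription

let package = Package(
    name: "LibProject",
    platforms: [.iOS(.v13)],
    targets: [
        .target(
            name: "LibModule2Framework",
            path: "Sources/LibModule2Framework",
            publicHeadersPath: "include",          // assumed layout
            cxxSettings: [
                .headerSearchPath("internal"),     // resolved relative to Sources/LibModule2Framework
                // .unsafeFlags(["-I/absolute/path/to/extra/headers"])  // possible, but marks the package unsafe
            ]
        )
    ],
    cxxLanguageStandard: .gnucxx17                 // .cpp sources then pick up <string> etc. from the SDK
)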

Swift Package Manager: commit in project's sub-dependency doesn't appear in package code

I have a project which uses a dependency (Lib), which contains a sub-dependency (Utilities). I have just updated this sub-dependency, adding some code, but I can't see it from my project.
Here is the dependency declared in my project:
The Package in Lib, with the dependency to Utilities:
// swift-tools-version: 5.6
import PackageDescription

let package = Package(
    name: "Lib",
    defaultLocalization: "en",
    platforms: [
        .iOS("12.1")
    ],
    products: [
        .library(name: "Lib",
                 targets: ["Lib"])
    ],
    dependencies: [
        .package(url: "git@bitbucket.org:__UTILITIES__.git", branch: "development")
    ],
    targets: [
        .target(name: "Lib",
                dependencies: [
                    .product(name: "Utilities",
                             package: "__UTILITIES__")
                ],
                path: "code")
    ]
)
__UTILITIES__ is the Utilities repo.
I have committed and pushed the new code to the Utilities repo, on the development branch. However, whatever I try (resetting package caches, or updating to the latest package versions), I never actually get my latest code in my project.
Am I missing something?
Thank you for your help
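No answer is recorded here either. The usual explanation is that a branch-based dependency is still pinned to one resolved commit in Package.resolved, so a new commit only arrives after packages are updated (File > Packages > Update to Latest Package Versions in Xcode, or swift package update), and the parent project's own resolution also has to be refreshed when Lib is itself a remote dependency. A sketch of the relevant part of Lib's manifest, with a hypothetical commit hash, showing the branch form next to the exact-revision form supported by the same tools version:

// swift-tools-version: 5.6
// Sketch of Lib's manifest; "abc1234" is a hypothetical commit hash.
import PackageDescription

let package = Package(
    name: "Lib",
    products: [
        .library(name: "Lib", targets: ["Lib"])
    ],
    dependencies: [
        // Branch dependency: resolves to one commit recorded in Package.resolved;
        // a plain build keeps that commit until the packages are updated.
        .package(url: "git@bitbucket.org:__UTILITIES__.git", branch: "development")
        // Exact commit, useful to verify the new code really is being fetched:
        // .package(url: "git@bitbucket.org:__UTILITIES__.git", revision: "abc1234")
    ],
    targets: [
        .target(
            name: "Lib",
            dependencies: [.product(name: "Utilities", package: "__UTILITIES__")],
            path: "code"
        )
    ]
)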

Swift Package Manager - binaryTarget with .zip file fails to validate

I tried different approaches to add a binaryTarget to a Swift package. Two of them worked out fine (Target1 and Target2 in the example), but the third approach (Target3), which should also work according to the documentation, does not validate: unsupported extension for binary target 'Target3'; valid extensions are: xcframework
To avoid bloating the repo too much with every binary release, I would prefer the zip approach here... Has anyone got it working with a binaryTarget and a .zip file in path: added to the package repository, or any hints on what I'm doing wrong here?
(Xcode 12.4, t3.zip containing only the .xcframework at root level)
// swift-tools-version:5.3
// The swift-tools-version declares the minimum version of Swift required to build this package.
import PackageDescription

let package = Package(
    name: "StackoverflowExamplePackage",
    platforms: [
        .iOS(.v9)
    ],
    products: [
        .library(
            name: "Lib1",
            targets: ["Target1"]),
        .library(
            name: "Lib2",
            targets: ["Target2", "Target3"]),
    ],
    dependencies: [
        // .package(url: /* package url */, from: "1.0.0"),
    ],
    targets: [
        .binaryTarget(
            name: "Target1",
            url: "https://myurl.example.com/t1-xcframework.zip",
            checksum: "777ddd6381e2201b7eb778b72f373f77e1190fd9dc9503f703e37c86d3b89674"
        ),
        .binaryTarget(name: "Target2", path: "./Binaries/t2.xcframework"),
        .binaryTarget(name: "Target3", path: "./Binaries/t3.zip"),
    ]
)
Zip archive support for local binary targets in SPM was merged in October 2021 and was finally released with Xcode 13.3.
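For reference, a minimal sketch of the local-zip form once the package is built with the Xcode 13.3 / Swift 5.6 toolchain (the tools-version bump below is an assumption; the targets and paths are taken from the question):

// swift-tools-version:5.6
import PackageDescription

let package = Package(
    name: "StackoverflowExamplePackage",
    products: [
        .library(name: "Lib2", targets: ["Target2", "Target3"])
    ],
    targets: [
        .binaryTarget(name: "Target2", path: "./Binaries/t2.xcframework"),
        // With 5.6 tooling a local zip validates; the archive must contain the
        // .xcframework at its root, as described above.
        .binaryTarget(name: "Target3", path: "./Binaries/t3.zip")
    ]
)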

Custom Script Extension not working on Red Hat 7.2

I am unable to get the Custom Script Extension working on Red Hat 7.2. I tried the latest extension and have the following in my ARM template:
{
    "name": "[concat(parameters('VMNamePrefix'), parameters('startingNumeral')[copyindex()],'/',parameters('VMNamePrefix'), parameters('startingNumeral')[copyindex()],'-CUSTOMSCRIPT')]",
    "type": "Microsoft.Compute/virtualMachines/extensions",
    "location": "[parameters('region')]",
    "apiVersion": "[variables('apiVersionVirtualMachines')]",
    "tags": {
        "ApmID": "[parameters('apmID')]",
        "ApplicationName": "[parameters('applicationName')]",
        "SharedService": "[parameters('sharedService')]",
        "PaaSOnly": "[parameters('paasOnly')]"
    },
    "copy": {
        "name": "customScriptLoop",
        "count": "[parameters('vmInstanceCount')]"
    },
    "dependsOn": [
        "[concat(parameters('VMNamePrefix'), parameters('startingNumeral')[copyindex()])]"
    ],
    "properties": {
        "publisher": "Microsoft.Azure.Extensions",
        "type": "CustomScript",
        "typeHandlerVersion": "2.0",
        "autoUpgradeMinorVersion": true,
        "settings": {
            "fileUris": [
                "[variables('customScriptUri')]"
            ]
        },
        "protectedSettings": {
            "commandToExecute": "[parameters('customScriptCommand')]"
        }
    }
}
The command to execute is pwd, but after about 90 minutes the extension gives up, and I see the following in the waagent.log file on Red Hat 7.2:
2018/09/10 13:30:49.361162 INFO [Microsoft.Compute.CustomScriptExtension-1.9.1] Target handler state: enabled
2018/09/10 13:30:49.390061 INFO [Microsoft.Compute.CustomScriptExtension-1.9.1] [Enable] current handler state is: notinstalled
2018/09/10 13:30:49.585331 INFO [Microsoft.Compute.CustomScriptExtension-1.9.1] Initialize extension directory
2018/09/10 13:30:49.615784 INFO [Microsoft.Compute.CustomScriptExtension-1.9.1] Update settings file: 0.settings
2018/09/10 13:30:49.644631 INFO [Microsoft.Compute.CustomScriptExtension-1.9.1] Install extension [install.cmd]
2018/09/10 13:30:50.678474 WARNING [Microsoft.Compute.CustomScriptExtension-1.9.1] [ExtensionError] Non-zero exit code: 127, install.cmd
2018/09/10 13:30:50.713928 INFO [Microsoft.Compute.CustomScriptExtension-1.9.1] Remove extension handler directory: /var/lib/waagent/Microsoft.Compute.CustomScriptExtension-1.9.1
2018/09/10 13:30:50.723392 INFO ExtHandler ProcessGoalState completed [incarnation 4; 1534 ms]
I am not seeing any other logs either. Any idea what could be going wrong? When I manually install the Custom Script Extension from the portal, it works fine.
Thanks,
Pranav

Logstash output not recognising columns in Kibana

I'm trying to get my .CSV file into Kibana for visualisation. It feels like I'm close to getting it to work, but I can't figure out how to get my output right.
In Kibana I see my .csv file as:
message: News,test@email.com,10.10.10.10
It looks like my CSV output ends up in one field called message. I would like to get 3 different fields: Name, Email, IP. I have tried a lot of CSV files and different configs, but no success yet.
CSV FILE:
Name,Email,IP
Auto,auto@newsuk,10.0.0.196
News,test@email.com,10.10.10.10
nieuwsbrieven,nieuwsbrieven@nl,10.10.10.10
CONF file:
input {
  file {
    path => "C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv"
    start_position => beginning
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Date","Open","High"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "csv_index"
  }
  stdout {}
}
filebeat.yml
filebeat.prospectors:
- input_type: log
  paths:
    - C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv
output.elasticsearch:
  hosts: ["localhost:9200"]
  template.name: "testttt"
  template.overwrite: true
output.logstash:
  hosts: ["localhost:5044"]
Logstash CMD output:
[2017-10-12T13:53:52,682][INFO ][logstash.pipeline ] Pipeline main started
[2017-10-12T13:53:52,690][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2017-10-12T13:53:53,003][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
{
"#timestamp" => 2017-10-12T11:53:53.659Z,
"offset" => 15,
"#version" => "1",
"input_type" => "log",
"beat" => {
"name" => "DESKTOP-VEQHHVT",
"hostname" => "DESKTOP-VEQHHVT",
"version" => "5.6.2"
},
"host" => "DESKTOP-VEQHHVT",
"source" => "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv",
"message" => "Name,Email,IP",
"type" => "log",
"tags" => [
[0] "beats_input_codec_plain_applied"
]
}
{
"#timestamp" => 2017-10-12T11:53:53.659Z,
"offset" => 44,
"#version" => "1",
"input_type" => "log",
"beat" => {
"name" => "DESKTOP-VEQHHVT",
"hostname" => "DESKTOP-VEQHHVT",
"version" => "5.6.2"
},
"host" => "DESKTOP-VEQHHVT",
"source" => "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv",
"message" => "Auto,auto#newsuk,10.0.0.196",
"type" => "log",
"tags" => [
[0] "beats_input_codec_plain_applied"
]
}
{
"#timestamp" => 2017-10-12T11:53:53.659Z,
"offset" => 77,
"#version" => "1",
"beat" => {
"name" => "DESKTOP-VEQHHVT",
"hostname" => "DESKTOP-VEQHHVT",
"version" => "5.6.2"
},
"input_type" => "log",
"host" => "DESKTOP-VEQHHVT",
"source" => "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv",
"message" => "News,test#email.com,10.10.10.10",
"type" => "log",
"tags" => [
[0] "beats_input_codec_plain_applied"
]
All my CSV columns/rows end up in the message field.
Curl command output: (curl -s localhost:9200/_cat/indices?v)
yellow open filebeat-2017.10.12 ux6-ByOERj-2XEBojkxhXg 5 1 3 0 13.3kb 13.3kb
Elasticsearch terminal output:
[2017-10-12T13:53:11,763][INFO ][o.e.n.Node ] [] initializing ...
[2017-10-12T13:53:11,919][INFO ][o.e.e.NodeEnvironment ] [Zs6ZAuy] using [1] data paths, mounts [[(C:)]], net usable_space [1.9tb], net total_space [1.9tb], spins? [unknown], types [NTFS]
[2017-10-12T13:53:11,920][INFO ][o.e.e.NodeEnvironment ] [Zs6ZAuy] heap size [1.9gb], compressed ordinary object pointers [true]
[2017-10-12T13:53:12,126][INFO ][o.e.n.Node ] node name [Zs6ZAuy] derived from node ID [Zs6ZAuyyR2auGVnPoD9gRw]; set [node.name] to override
[2017-10-12T13:53:12,128][INFO ][o.e.n.Node ] version[5.6.2], pid[3384], build[57e20f3/2017-09-23T13:16:45.703Z], OS[Windows 10/10.0/amd64], JVM[Oracle Corporation/Java HotSpot(TM) 64-Bit Server VM/1.8.0_144/25.144-b01]
[2017-10-12T13:53:12,128][INFO ][o.e.n.Node ] JVM arguments [-Xms2g, -Xmx2g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -Djdk.io.permissionsUseCanonicalPath=true, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -Dlog4j.skipJansi=true, -XX:+HeapDumpOnOutOfMemoryError, -Delasticsearch, -Des.path.home=C:\ELK-Stack\elasticsearch\elasticsearch-5.6.2]
[2017-10-12T13:53:13,550][INFO ][o.e.p.PluginsService ] [Zs6ZAuy] loaded module [aggs-matrix-stats]
[2017-10-12T13:53:13,616][INFO ][o.e.p.PluginsService ] [Zs6ZAuy] loaded module [ingest-common]
[2017-10-12T13:53:13,722][INFO ][o.e.p.PluginsService ] [Zs6ZAuy] loaded module [lang-expression]
[2017-10-12T13:53:13,798][INFO ][o.e.p.PluginsService ] [Zs6ZAuy] loaded module [lang-groovy]
[2017-10-12T13:53:13,886][INFO ][o.e.p.PluginsService ] [Zs6ZAuy] loaded module [lang-mustache]
[2017-10-12T13:53:13,988][INFO ][o.e.p.PluginsService ] [Zs6ZAuy] loaded module [lang-painless]
[2017-10-12T13:53:14,059][INFO ][o.e.p.PluginsService ] [Zs6ZAuy] loaded module [parent-join]
[2017-10-12T13:53:14,154][INFO ][o.e.p.PluginsService ] [Zs6ZAuy] loaded module [percolator]
[2017-10-12T13:53:14,223][INFO ][o.e.p.PluginsService ] [Zs6ZAuy] loaded module [reindex]
[2017-10-12T13:53:14,289][INFO ][o.e.p.PluginsService ] [Zs6ZAuy] loaded module [transport-netty3]
[2017-10-12T13:53:14,360][INFO ][o.e.p.PluginsService ] [Zs6ZAuy] loaded module [transport-netty4]
[2017-10-12T13:53:14,448][INFO ][o.e.p.PluginsService ] [Zs6ZAuy] no plugins loaded
[2017-10-12T13:53:18,328][INFO ][o.e.d.DiscoveryModule ] [Zs6ZAuy] using discovery type [zen]
[2017-10-12T13:53:19,204][INFO ][o.e.n.Node ] initialized
[2017-10-12T13:53:19,204][INFO ][o.e.n.Node ] [Zs6ZAuy] starting ...
[2017-10-12T13:53:20,071][INFO ][o.e.t.TransportService ] [Zs6ZAuy] publish_address {127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}, {[::1]:9300}
[2017-10-12T13:53:23,130][INFO ][o.e.c.s.ClusterService ] [Zs6ZAuy] new_master {Zs6ZAuy}{Zs6ZAuyyR2auGVnPoD9gRw}{jBwTE7rUS4i_Ugh6k6DAMg}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-elected-as-master ([0] nodes joined)
[2017-10-12T13:53:23,883][INFO ][o.e.g.GatewayService ] [Zs6ZAuy] recovered [5] indices into cluster_state
[2017-10-12T13:53:25,962][INFO ][o.e.c.r.a.AllocationService] [Zs6ZAuy] Cluster health status changed from [RED] to [YELLOW] (reason: [shards started [[.kibana][0]] ...]).
[2017-10-12T13:53:25,981][INFO ][o.e.h.n.Netty4HttpServerTransport] [Zs6ZAuy] publish_address {127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}, {[::1]:9200}
[2017-10-12T13:53:25,986][INFO ][o.e.n.Node ] [Zs6ZAuy] started
[2017-10-12T13:53:59,245][INFO ][o.e.c.m.MetaDataCreateIndexService] [Zs6ZAuy] [filebeat-2017.10.12] creating index, cause [auto(bulk api)], templates [filebeat, testttt], shards [5]/[1], mappings [_default_]
[2017-10-12T13:53:59,721][INFO ][o.e.c.m.MetaDataMappingService] [Zs6ZAuy] [filebeat-2017.10.12/ux6-ByOERj-2XEBojkxhXg] create_mapping [doc]
Filebeat output:
C:\ELK-Stack\filebeat>filebeat -e -c filebeat.yml -d "publish"
2017/10/12 11:53:53.632142 beat.go:297: INFO Home path: [C:\ELK-Stack\filebeat] Config path: [C:\ELK-Stack\filebeat] Data path: [C:\ELK-Stack\filebeat\data] Logs path: [C:\ELK-Stack\filebeat\logs]
2017/10/12 11:53:53.632142 beat.go:192: INFO Setup Beat: filebeat; Version: 5.6.2
2017/10/12 11:53:53.634143 publish.go:228: WARN Support for loading more than one output is deprecated and will not be supported in version 6.0.
2017/10/12 11:53:53.635144 output.go:258: INFO Loading template enabled. Reading template file: C:\ELK-Stack\filebeat\filebeat.template.json
2017/10/12 11:53:53.636144 output.go:269: INFO Loading template enabled for Elasticsearch 2.x. Reading template file: C:\ELK-Stack\filebeat\filebeat.template-es2x.json
2017/10/12 11:53:53.637143 output.go:281: INFO Loading template enabled for Elasticsearch 6.x. Reading template file: C:\ELK-Stack\filebeat\filebeat.template-es6x.json
2017/10/12 11:53:53.638144 client.go:128: INFO Elasticsearch url: http://localhost:9200
2017/10/12 11:53:53.639143 outputs.go:108: INFO Activated elasticsearch as output plugin.
2017/10/12 11:53:53.639143 logstash.go:90: INFO Max Retries set to: 3
2017/10/12 11:53:53.640143 outputs.go:108: INFO Activated logstash as output plugin.
2017/10/12 11:53:53.640143 publish.go:243: DBG Create output worker
2017/10/12 11:53:53.641143 publish.go:243: DBG Create output worker
2017/10/12 11:53:53.641143 publish.go:285: DBG No output is defined to store the topology. The server fields might not be filled.
2017/10/12 11:53:53.642144 publish.go:300: INFO Publisher name: DESKTOP-VEQHHVT
2017/10/12 11:53:53.634143 metrics.go:23: INFO Metrics logging every 30s
2017/10/12 11:53:53.646143 async.go:63: INFO Flush Interval set to: 1s
2017/10/12 11:53:53.647142 async.go:64: INFO Max Bulk Size set to: 50
2017/10/12 11:53:53.647142 async.go:72: DBG create bulk processing worker (interval=1s, bulk size=50)
2017/10/12 11:53:53.648144 async.go:63: INFO Flush Interval set to: 1s
2017/10/12 11:53:53.648144 async.go:64: INFO Max Bulk Size set to: 2048
2017/10/12 11:53:53.649144 async.go:72: DBG create bulk processing worker (interval=1s, bulk size=2048)
2017/10/12 11:53:53.649144 beat.go:233: INFO filebeat start running.
2017/10/12 11:53:53.650144 registrar.go:68: INFO No registry file found under: C:\ELK-Stack\filebeat\data\registry. Creating a new registry file.
2017/10/12 11:53:53.652144 registrar.go:106: INFO Loading registrar data from C:\ELK-Stack\filebeat\data\registry
2017/10/12 11:53:53.654145 registrar.go:123: INFO States Loaded from registrar: 0
2017/10/12 11:53:53.655145 crawler.go:38: INFO Loading Prospectors: 1
2017/10/12 11:53:53.655145 prospector_log.go:65: INFO Prospector with previous states loaded: 0
2017/10/12 11:53:53.656144 prospector.go:124: INFO Starting prospector of type: log; id: 11034545279404679229
2017/10/12 11:53:53.656144 crawler.go:58: INFO Loading and starting Prospectors completed. Enabled prospectors: 1
2017/10/12 11:53:53.655145 spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2017/10/12 11:53:53.655145 registrar.go:236: INFO Starting Registrar
2017/10/12 11:53:53.657144 log.go:91: INFO Harvester started for file: C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv
2017/10/12 11:53:53.655145 sync.go:41: INFO Start sending events to output
2017/10/12 11:53:58.682432 client.go:214: DBG Publish: {
"#timestamp": "2017-10-12T11:53:53.659Z",
"beat": {
"hostname": "DESKTOP-VEQHHVT",
"name": "DESKTOP-VEQHHVT",
"version": "5.6.2"
},
"input_type": "log",
"message": "Name,Email,IP",
"offset": 15,
"source": "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv",
"type": "log"
}
2017/10/12 11:53:58.685434 client.go:214: DBG Publish: {
"#timestamp": "2017-10-12T11:53:53.659Z",
"beat": {
"hostname": "DESKTOP-VEQHHVT",
"name": "DESKTOP-VEQHHVT",
"version": "5.6.2"
},
"input_type": "log",
"message": "Auto,auto#newsuk,10.0.0.196",
"offset": 44,
"source": "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv",
"type": "log"
}
2017/10/12 11:53:58.685434 client.go:214: DBG Publish: {
"#timestamp": "2017-10-12T11:53:53.659Z",
"beat": {
"hostname": "DESKTOP-VEQHHVT",
"name": "DESKTOP-VEQHHVT",
"version": "5.6.2"
},
"input_type": "log",
"message": "News,test#email.com,10.10.10.10",
"offset": 77,
"source": "C:\\Users\\JOEY2\\Desktop\\Deelproblemen\\Applicatie\\Output\\test.csv",
"type": "log"
}
2017/10/12 11:53:58.686434 output.go:109: DBG output worker: publish 3 events
2017/10/12 11:53:58.686434 output.go:109: DBG output worker: publish 3 events
2017/10/12 11:53:58.738437 client.go:667: INFO Connected to Elasticsearch version 5.6.2
2017/10/12 11:53:58.748436 output.go:317: INFO Trying to load template for client: http://localhost:9200
2017/10/12 11:53:58.890446 output.go:324: INFO Existing template will be overwritten, as overwrite is enabled.
2017/10/12 11:53:59.154461 client.go:592: INFO Elasticsearch template with name 'testttt' loaded
2017/10/12 11:54:00.020510 sync.go:70: DBG Events sent: 4
Kibana output:
@timestamp:October 12th 2017, 13:53:53.659 beat.hostname:DESKTOP-VEQHHVT beat.name:DESKTOP-VEQHHVT beat.version:5.6.2 input_type:log message:Auto,auto@newsuk,10.0.0.196 offset:44 source:C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv type:log _id:AV8QbyIcTtSiVplm9CwA _type:doc _index:filebeat-2017.10.12 _score:1
@timestamp:October 12th 2017, 13:53:53.659 beat.hostname:DESKTOP-VEQHHVT beat.name:DESKTOP-VEQHHVT beat.version:5.6.2 input_type:log message:News,test@email.com,10.10.10.10 offset:77 source:C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv type:log _id:AV8QbyIcTtSiVplm9CwB _type:doc _index:filebeat-2017.10.12 _score:1
@timestamp:October 12th 2017, 13:53:53.659 beat.hostname:DESKTOP-VEQHHVT beat.name:DESKTOP-VEQHHVT beat.version:5.6.2 input_type:log message:Name,Email,IP offset:15 source:C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv type:log _id:AV8QbyIcTtSiVplm9Cv_ _type:doc _index:filebeat-2017.10.12 _score:1
You are giving the wrong column names in the csv filter, and the column names should be given without double quotes (").
I have tried this and it works for me. Check whether it works for you. My Logstash config file:
input {
  file {
    path => "/home/quality/Desktop/work/csv.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => [Name,Email,IP]
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "csv"
    document_type => "csv"
  }
  stdout { codec => rubydebug }
}
