
Splunk version 6.5.3.1. Jenkins job console output is going to / being monitored by Splunk - how do I fetch that data for the last N minutes/hours/days etc. using curl / the REST API?

Splunk build: bf0ff7c2ab8b

Jenkins version: 1.642.3 or 2.32.3

On every Jenkins master, there is a Splunk process running.

$ ps -eAf|grep splunk 
splunk 58877  1 20 Feb16 ?  42-23:27:37 splunkd -p 8089 restart 
splunk 58878 58877 0 Feb16 ?  00:00:00 [splunkd pid=58877] splunkd -p 8089 restart [process-runner] 
asangal 91197 91175 0 12:38 pts/2 00:00:00 grep --color=auto splunk 

The Splunk process monitors/scans the log file of any Jenkins job, which in our case sits under $JENKINS_HOME/jobs/<JOB_NAME>/builds/<BUILD_NUMBER>/log.

$ pwd 
/opt/splunkforwarder/etc/system/local 
$ cat inputs.conf 
[default] 
host = jenkins-master-project-prod-1-609 

[monitor:///var/log/jenkins] 
index = some-jenkins-prod-index-in-splunk 
disabled = False 
recursive = True 

[monitor:///home/jenkins/jobs/.../builds/.../log] 
index = some-jenkins-prod-index-in-splunk 
disabled = False 
recursive = True 
crcSalt = <SOURCE> 

... 
..... 
... more config code here ... 
.... 
.. 

In the Splunk GUI, when I run a simple query looking for anything Splunk captured against the same index and coming from any source (file), I do see valid output. Note: the actual line output is truncated, but as you can see from the bar chart, the data is there and the table is populated. [screenshot of the Splunk search results]

In my Jenkins jobs I sometimes get warnings, info messages and errors (I'm already using the Log Parser Plugin for this at the Jenkins level), and I'm trying to write a script that will fetch the Jenkins job log output from Splunk for the last 15 or 30 minutes, or for the last 1-7 hours or 1-30 days, and tell me how many warnings, errors etc. were found in that window (based on certain keywords and regexes). NOTE: there are many such Jenkins masters on which Splunk is running, and my goal is to talk to Splunk and get the data I need (rather than talking to 500 Jenkins masters).

I tried the following CURL commands, which return me a SEARCH ID, but it's not doing anything.

In the following CURL command I'm passing a more refined query to fetch the data. I'm saying: fetch all the info Splunk holds (fields can be added per the GUI) for the last 30 minutes, where the index is some-jenkins-prod-index-in-splunk and where the source of the log is /home/jenkins/jobs/*/builds/*/log (the first * is for the job name, the second * is for the build number), then search those logs for ones which contain the lines/keywords/regexes listed below (combined with OR conditions), and show me the output in JSON format.

➜ ~ p=$(cat ~/AKS/rOnly/od.p.txt) 
➜ ~ curl --connect-time 10 --max-time 900 -ks https://splunk.server.mycompany.com:8089/services/search -umy_splunk_user:$p --data search='search earliest=-30m index=some-jenkins-prod-index-in-splunk source=/home/jenkins/jobs/*/builds/*/log ("WARNING: " OR "npm WARN retry" OR "svn: E200033: " OR ": binary operator expected" OR ": too many arguments" OR ": No such file or directory" OR "rsync: failed to set times on")' -d output_mode=json 
{"messages":[{"type":"ERROR","text":"Method Not Allowed"}]}%                     ➜ ~ 

As you can see, it gives me Method Not Allowed.

When I appended /jobs to the URL part as shown below, I did get a valid SEARCH ID.

➜ ~ curl --connect-time 10 --max-time 900 -ks https://splunk.server.mycompany.com:8089/services/search/jobs -umy_splunk_user:$p --data search='search earliest=-30m index=some-jenkins-prod-index-in-splunk source=/home/jenkins/jobs/*/builds/*/log ("WARNING: " OR "npm WARN retry" OR "svn: E200033: " OR ": binary operator expected" OR ": too many arguments" OR ": No such file or directory" OR "rsync: failed to set times on")' -d output_mode=json 
{"sid":"1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A"}% 

Using this search ID, I tried to get to the actual logs, but it did not work. I'm using jq to filter the JSON output and display it in a nice layout.

➜ ~ curl --connect-time 10 --max-time 900 -ks https://splunk.server.mycompany.com:8089/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A -umy_splunk_user:$p --data search='search earliest=-30m index=some-jenkins-prod-index-in-splunk source=/home/jenkins/jobs/*/builds/*/log ("WARNING: " OR "npm WARN retry" OR "svn: E200033: " OR ": binary operator expected" OR ": too many arguments" OR ": No such file or directory" OR "rsync: failed to set times on")' -d output_mode=json|jq . 
{ 
    "links": {}, 
    "origin": "http://splunk.server.mycompany.com/services/search/jobs", 
    "updated": "2017-09-15T09:44:33-07:00", 
    "generator": { 
    "build": "bf0ff7c2ab8b", 
    "version": "6.5.3.1" 
    }, 
    "entry": [ 
    { 
     "name": "search earliest=-30m index=some-jenkins-prod-index-in-splunk source=/home/jenkins/jobs/*/builds/*/log (\"WARNING: \" OR \"npm WARN retry\" OR \"svn: E200033: \" OR \": binary operator expected\" OR \": too many arguments\" OR \": No such file or directory\" OR \"rsync: failed to set times on\") | regex source=\".*/[0-9][0-9]*/log\" | table host, source, _raw", 
     "id": "http://splunk.server.mycompany.com/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A", 
     "updated": "2017-09-15T09:44:33.942-07:00", 
     "links": { 
     "alternate": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A", 
     "search.log": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/search.log", 
     "events": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/events", 
     "results": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/results", 
     "results_preview": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/results_preview", 
     "timeline": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/timeline", 
     "summary": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/summary", 
     "control": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/control" 
     }, 
     "published": "2017-09-15T09:43:59.000-07:00", 
     "author": "my_splunk_user", 
     "content": { 
     "bundleVersion": "17557160226808436058", 
     "canSummarize": false, 
     "cursorTime": "1969-12-31T16:00:00.000-08:00", 
     "defaultSaveTTL": "2592000", 
     "defaultTTL": "600", 
     "delegate": "", 
     "diskUsage": 561152, 
     "dispatchState": "DONE", 
     "doneProgress": 1, 
     "dropCount": 0, 
     "earliestTime": "2017-09-15T09:13:58.000-07:00", 
     "eventAvailableCount": 0, 
     "eventCount": 30, 
     "eventFieldCount": 0, 
     "eventIsStreaming": true, 
     "eventIsTruncated": true, 
     "eventSearch": "search (earliest=-30m index=some-jenkins-prod-index-in-splunk source=/home/jenkins/jobs/*/builds/*/log (\"WARNING: \" OR \"npm WARN retry\" OR \"svn: E200033: \" OR \": binary operator expected\" OR \": too many arguments\" OR \": No such file or directory\" OR \"rsync: failed to set times on\")) | regex source=\".*/[0-9][0-9]*/log\" ", 
     "eventSorting": "none", 
     "isBatchModeSearch": true, 
     "isDone": true, 
     "isEventsPreviewEnabled": false, 
     "isFailed": false, 
     "isFinalized": false, 
     "isPaused": false, 
     "isPreviewEnabled": false, 
     "isRealTimeSearch": false, 
     "isRemoteTimeline": false, 
     "isSaved": false, 
     "isSavedSearch": false, 
     "isTimeCursored": true, 
     "isZombie": false, 
     "keywords": "\"*: binary operator expected*\" \"*: no such file or directory*\" \"*: too many arguments*\" \"*npm warn retry*\" \"*rsync: failed to set times on*\" \"*svn: e200033: *\" \"*warning: *\" earliest::-30m index::some-jenkins-prod-index-in-splunk source::/home/jenkins/jobs/*/builds/*/log", 
     "label": "", 
     "latestTime": "2017-09-15T09:43:59.561-07:00", 
     "normalizedSearch": "litsearch (index=some-jenkins-prod-index-in-splunk source=/home/jenkins/jobs/*/builds/*/log (\"WARNING: \" OR \"npm WARN retry\" OR \"svn: E200033: \" OR \": binary operator expected\" OR \": too many arguments\" OR \": No such file or directory\" OR \"rsync: failed to set times on\") _time>=1505492038.000) | regex source=\".*/[0-9][0-9]*/log\" | fields keepcolorder=t \"_raw\" \"host\" \"source\"", 
     "numPreviews": 0, 
     "optimizedSearch": "| search (earliest=-30m index=some-jenkins-prod-index-in-splunk source=/home/jenkins/jobs/*/builds/*/log (\"WARNING: \" OR \"npm WARN retry\" OR \"svn: E200033: \" OR \": binary operator expected\" OR \": too many arguments\" OR \": No such file or directory\" OR \"rsync: failed to set times on\")) | regex source=\".*/[0-9][0-9]*/log\" | table host, source, _raw", 
     "pid": "2174", 
     "priority": 5, 
     "remoteSearch": "litsearch (index=some-jenkins-prod-index-in-splunk source=/home/jenkins/jobs/*/builds/*/log (\"WARNING: \" OR \"npm WARN retry\" OR \"svn: E200033: \" OR \": binary operator expected\" OR \": too many arguments\" OR \": No such file or directory\" OR \"rsync: failed to set times on\") _time>=1505492038.000) | regex source=\".*/[0-9][0-9]*/log\" | fields keepcolorder=t \"_raw\" \"host\" \"source\"", 
     "reportSearch": "table host, source, _raw", 
     "resultCount": 30, 
     "resultIsStreaming": false, 
     "resultPreviewCount": 30, 
     "runDuration": 0.579, 
     "sampleRatio": "1", 
     "sampleSeed": "0", 
     "scanCount": 301, 
     "searchCanBeEventType": false, 
     "searchEarliestTime": 1505492038, 
     "searchLatestTime": 1505493839.21872, 
     "searchTotalBucketsCount": 37, 
     "searchTotalEliminatedBucketsCount": 0, 
     "sid": "1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A", 
     "statusBuckets": 0, 
     "ttl": 600, 
     "performance": { 
      "command.fields": { 
      "duration_secs": 0.035, 
      "invocations": 48, 
      "input_count": 30, 
      "output_count": 30 
      }, 
      "command.regex": { 
      "duration_secs": 0.048, 
      "invocations": 48, 
      "input_count": 30, 
      "output_count": 30 
      }, 
      "command.search": { 
      "duration_secs": 1.05, 
      "invocations": 48, 
      "input_count": 0, 
      "output_count": 30 
      }, 
      "command.search.calcfields": { 
      "duration_secs": 0.013, 
      "invocations": 16, 
      "input_count": 301, 
      "output_count": 301 
      }, 
      "dispatch.optimize.reparse": { 
      "duration_secs": 0.001, 
      "invocations": 1 
      }, 
      "dispatch.optimize.toJson": { 
      "duration_secs": 0.001, 
      "invocations": 1 
      }, 
      "dispatch.optimize.toSpl": { 
      "duration_secs": 0.001, 
      "invocations": 1 
      }, 
      "dispatch.parserThread": { 
      "duration_secs": 0.048, 
      "invocations": 48 
      }, 
      "dispatch.reduce": { 
      "duration_secs": 0.001, 
      "invocations": 1 
      }, 
      "dispatch.stream.remote": { 
      "duration_secs": 1.05, 
      "invocations": 48, 
      "input_count": 0, 
      "output_count": 332320 
      }, 
      "dispatch.stream.remote.mr11p01if-ztbv02090901.mr.if.mycompany.com-8081": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr11p01if-ztbv02090901.mr.if.mycompany.com-8082": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr11p01if-ztbv11204201.mr.if.mycompany.com-8081": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr11p01if-ztbv11204201.mr.if.mycompany.com-8082": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr11p01if-ztbv11204401.mr.if.mycompany.com-8081": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr11p01if-ztbv11204401.mr.if.mycompany.com-8082": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr11p01if-ztbv16142101.mr.if.mycompany.com-8081": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr11p01if-ztbv16142101.mr.if.mycompany.com-8082": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr11p01if-ztbv16142301.mr.if.mycompany.com-8081": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr11p01if-ztbv16142301.mr.if.mycompany.com-8082": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr21p01if-ztbv14080101.mr.if.mycompany.com-8081": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr21p01if-ztbv14080101.mr.if.mycompany.com-8082": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr22p01if-ztbv1.mr.if.mycompany.com-8081": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr22p01if-ztbv1.mr.if.mycompany.com-8082": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr22p01if-ztbv09013201.mr.if.mycompany.com-8081": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr22p01if-ztbv09013201.mr.if.mycompany.com-8082": { 
      "duration_secs": 0.001, 
      "invocations": 1, 
      "input_count": 0, 
      "output_count": 5422 
      }, 
      "dispatch.stream.remote.mr90p01if-ztep02103701.mr.if.mycompany.com-8081": { 
      "duration_secs": 0.058, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 16948 
      }, 
      "dispatch.stream.remote.mr90p01if-ztep02103701.mr.if.mycompany.com-8082": { 
      "duration_secs": 0.066, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 14415 
      }, 
      "dispatch.stream.remote.mr90p01if-ztep04044101.mr.if.mycompany.com-8081": { 
      "duration_secs": 0.059, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 15858 
      }, 
      "dispatch.stream.remote.mr90p01if-ztep04044101.mr.if.mycompany.com-8082": { 
      "duration_secs": 0.065, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 11867 
      }, 
      "dispatch.stream.remote.mr90p01if-ztep06024101.mr.if.mycompany.com-8081": { 
      "duration_secs": 0.061, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 20695 
      }, 
      "dispatch.stream.remote.mr90p01if-ztep06024101.mr.if.mycompany.com-8082": { 
      "duration_secs": 0.06, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 15193 
      }, 
      "dispatch.stream.remote.mr90p01if-ztep12023601.mr.if.mycompany.com-8081": { 
      "duration_secs": 0.063, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 15932 
      }, 
      "dispatch.stream.remote.mr90p01if-ztep12023601.mr.if.mycompany.com-8082": { 
      "duration_secs": 0.064, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 14415 
      }, 
      "dispatch.stream.remote.mr90p01if-ztep12043901.mr.if.mycompany.com-8081": { 
      "duration_secs": 0.061, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 15418 
      }, 
      "dispatch.stream.remote.mr90p01if-ztep12043901.mr.if.mycompany.com-8082": { 
      "duration_secs": 0.058, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 11866 
      }, 
      "dispatch.stream.remote.pv31p01if-ztbv08050801.pv.if.mycompany.com-8081": { 
      "duration_secs": 0.075, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 15661 
      }, 
      "dispatch.stream.remote.pv31p01if-ztbv08050801.pv.if.mycompany.com-8082": { 
      "duration_secs": 0.071, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 15845 
      }, 
      "dispatch.stream.remote.pv31p01if-ztbv08051001.pv.if.mycompany.com-8081": { 
      "duration_secs": 0.066, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 14406 
      }, 
      "dispatch.stream.remote.pv31p01if-ztbv08051001.pv.if.mycompany.com-8082": { 
      "duration_secs": 0.072, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 15524 
      }, 
      "dispatch.stream.remote.pv31p01if-ztbv08051201.pv.if.mycompany.com-8081": { 
      "duration_secs": 0.067, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 16009 
      }, 
      "dispatch.stream.remote.pv31p01if-ztbv08051201.pv.if.mycompany.com-8082": { 
      "duration_secs": 0.068, 
      "invocations": 2, 
      "input_count": 0, 
      "output_count": 15516 
      }, 
      "dispatch.writeStatus": { 
      "duration_secs": 0.012, 
      "invocations": 7 
      }, 
      "startup.configuration": { 
      "duration_secs": 2.045, 
      "invocations": 33 
      }, 
      "startup.handoff": { 
      "duration_secs": 14.595, 
      "invocations": 33 
      } 
     }, 
     "messages": [ 
      { 
      "type": "INFO", 
      "text": "Your timerange was substituted based on your search string" 
      }, 
      { 
      "type": "WARN", 
      "text": "Unable to distribute to peer named pv31p01if-ztbv08050601.pv.if.mycompany.com:8081 at uri=pv31p01if-ztbv08050601.pv.if.mycompany.com:8081 using the uri-scheme=http because peer has status=\"Down\". Please verify uri-scheme, connectivity to the search peer, that the search peer is up, and an adequate level of system resources are available. See the Troubleshooting Manual for more information." 
      }, 
      { 
      "type": "WARN", 
      "text": "Unable to distribute to peer named pv31p01if-ztbv08050601.pv.if.mycompany.com:8082 at uri=pv31p01if-ztbv08050601.pv.if.mycompany.com:8082 using the uri-scheme=http because peer has status=\"Down\". Please verify uri-scheme, connectivity to the search peer, that the search peer is up, and an adequate level of system resources are available. See the Troubleshooting Manual for more information." 
      } 
     ], 
     "request": { 
      "search": "search earliest=-30m index=some-jenkins-prod-index-in-splunk source=/home/jenkins/jobs/*/builds/*/log (\"WARNING: \" OR \"npm WARN retry\" OR \"svn: E200033: \" OR \": binary operator expected\" OR \": too many arguments\" OR \": No such file or directory\" OR \"rsync: failed to set times on\") | regex source=\".*/[0-9][0-9]*/log\" | table host, source, _raw" 
     }, 
     "runtime": { 
      "auto_cancel": "0", 
      "auto_pause": "0" 
     }, 
     "searchProviders": [ 
      "mr11p01if-ztbv02090901.mr.if.mycompany.com-8081", 
      "mr11p01if-ztbv16142101.mr.if.mycompany.com-8082", 
      "mr11p01if-ztbv16142301.mr.if.mycompany.com-8081", 
      "mr11p01if-ztbv16142301.mr.if.mycompany.com-8082", 
      "mr21p01if-ztbv14080101.mr.if.mycompany.com-8081", 
      "mr21p01if-ztbv14080101.mr.if.mycompany.com-8082", 
      "mr22p01if-ztbv1.mr.if.mycompany.com-8081", 
      "mr22p01if-ztbv1.mr.if.mycompany.com-8082", 
      "mr22p01if-ztbv09013201.mr.if.mycompany.com-8081", 
      "mr22p01if-ztbv09013201.mr.if.mycompany.com-8082", 
      "mr90p01if-ztep02103701.mr.if.mycompany.com-8081", 
      "mr90p01if-ztep02103701.mr.if.mycompany.com-8082", 
      "mr90p01if-ztep04044101.mr.if.mycompany.com-8081", 
      "mr90p01if-ztep04044101.mr.if.mycompany.com-8082", 
      "mr90p01if-ztep06024101.mr.if.mycompany.com-8081", 
      "mr90p01if-ztep06024101.mr.if.mycompany.com-8082", 
      "mr90p01if-ztep12023601.mr.if.mycompany.com-8081", 
      "mr90p01if-ztep12023601.mr.if.mycompany.com-8082", 
      "mr90p01if-ztep12043901.mr.if.mycompany.com-8081", 
      "mr90p01if-ztep12043901.mr.if.mycompany.com-8082", 
      "pv31p01if-ztbv08050801.pv.if.mycompany.com-8081", 
      "pv31p01if-ztbv08050801.pv.if.mycompany.com-8082", 
      "pv31p01if-ztbv08051001.pv.if.mycompany.com-8081", 
      "pv31p01if-ztbv08051001.pv.if.mycompany.com-8082", 
      "pv31p01if-ztbv08051201.pv.if.mycompany.com-8081", 
      "pv31p01if-ztbv08051201.pv.if.mycompany.com-8082" 
     ] 
     }, 
     "acl": { 
     "perms": { 
      "read": [ 
      "my_splunk_user" 
      ], 
      "write": [ 
      "my_splunk_user" 
      ] 
     }, 
     "owner": "my_splunk_user", 
     "modifiable": true, 
     "sharing": "global", 
     "app": "search", 
     "can_write": true, 
     "ttl": "600" 
     } 
    } 
    ], 
    "paging": { 
    "total": 1, 
    "perPage": 0, 
    "offset": 0 
    } 
} 
➜ ~ 
➜ ~ 

But, as you can see, the resulting JSON output is not useful, as it doesn't show or contain any of the Jenkins job's output that I could use.

If, in the CURL command, I try any of the following URL endpoints for the Splunk URL, it gives me an error.

"search.log": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/search.log", 
    "events": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/events", 
    "results": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/results", 
    "results_preview": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/results_preview", 
    "timeline": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/timeline", 
    "summary": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/summary", 
    "control": "/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/control" 

For example, if I try the URL .../<SEARCH_ID>/events, or the URL .../<SEARCH_ID>/results, etc., I get the following error.

curl --connect-time 10 --max-time 900 -ks https://splunk.server.mycompany.com:8089/services/search/jobs/1505493838.3723_ACEB82F4-AA21-4AE2-95A3-566F6BCAA05A/events -umy_splunk_user:$p --data search='search earliest=-30m index=some-jenkins-prod-index-in-splunk source=/home/jenkins/jobs/*/builds/*/log ("WARNING: " OR "npm WARN retry" OR "svn: E200033: " OR ": binary operator expected" OR ": too many arguments" OR ": No such file or directory" OR "rsync: failed to set times on")' -d output_mode=json|jq . 

{ 
    "messages": [ 
    { 
     "type": "FATAL", 
     "text": "Method Not Allowed" 
    } 
    ] 
} 

What I'm trying to find is: the host name, the source (path of the Jenkins job's log), and the actual job's console output (which I can read and parse to generate meaningful information) for the last N time period, i.e. how many errors, warnings and weird lines showed up, and, depending on some thresholds, if those numbers cross the thresholds then I need to send out email notifications.

I can code all of that, but I'm not getting the very first piece of the puzzle here, which is getting Splunk to spit out the CONSOLE OUTPUT of the Jenkins jobs that Splunk is monitoring on the file system.

The end goal is to dump the meaningful data into a text file in JSON or CSV format and convert that data into some meaningful bar/pie charts etc.

For example, if data.csv contains:

age,population 
<5,2704659 
5-13,4499890 
14-17,2159981 
18-24,3853788 
25-44,14106543 
45-64,8819342 
65-85,312463 
≥85,81312463 

Then, using the following file, I can convert this raw data into a pie chart which will look like the image snapshot shown below.

<!DOCTYPE html> 
<meta charset="utf-8"> 
<style> 

.arc text { 
    font: 10px sans-serif; 
    text-anchor: middle; 
} 

.arc path { 
    stroke: #fff; 
} 

</style> 
<svg width="960" height="500"></svg> 
<script src="https://d3js.org/d3.v4.min.js"></script> 
<script> 

var svg = d3.select("svg"), 
    width = +svg.attr("width"), 
    height = +svg.attr("height"), 
    radius = Math.min(width, height)/2, 
    g = svg.append("g").attr("transform", "translate(" + width/2 + "," + height/2 + ")"); 

var color = d3.scaleOrdinal(["#98abc5", "#8a89a6", "#7b6888", "#6b486b", "#a05d56", "#d0743c", "#ff8c00"]); 

var pie = d3.pie() 
    .sort(null) 
    .value(function(d) { return d.population; }); 

var path = d3.arc() 
    .outerRadius(radius - 10) 
    .innerRadius(0); 

var label = d3.arc() 
    .outerRadius(radius - 40) 
    .innerRadius(radius - 40); 

d3.csv("data.csv", function(d) { 
    d.population = +d.population; 
    return d; 
}, function(error, data) { 
    if (error) throw error; 

    var arc = g.selectAll(".arc") 
    .data(pie(data)) 
    .enter().append("g") 
     .attr("class", "arc"); 

    arc.append("path") 
     .attr("d", path) 
     .attr("fill", function(d) { return color(d.data.age); }); 

    arc.append("text") 
     .attr("transform", function(d) { return "translate(" + label.centroid(d) + ")"; }) 
     .attr("dy", "0.35em") 
     .text(function(d) { return d.data.age; }); 
}); 

</script> 

Generated pie chart (thanks to the csv and HTML files above): [screenshot of the resulting pie chart]
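One caveat with the D3 snippet above: d3.csv loads data.csv over HTTP, so opening the HTML file straight from disk usually won't work. Serving the directory locally first is enough; the Python one-liner below is just an assumed convenience (any static file server will do), and the directory path is hypothetical:

$ cd /path/to/dir-with-html-and-data.csv     # hypothetical path holding the HTML file and data.csv
$ python3 -m http.server 8000                # then browse to http://localhost:8000/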

Answer


Found the solution.

I just had to use the services/search/jobs/export endpoint.

Let's find out what our Jenkins host is (which runs the Jenkins job), what the job's name is (it can be parsed/grepped from the source path of the log file), and what the Jenkins job's actual console output is (_raw). Also, let's limit our search to find the data within the last 30 minutes only (i.e. earliest=-30m).

There are actually 3 ways to do it.

1) By passing the username and password on the command line.

2) By generating a SESSION TOKEN, which we can then pass in the header of any future CURL commands.

3) By generating a --cookie "${COOKIE}" ID and using that. This is the preferred method of all, as it replicates the cookie value to any backend servers Splunk uses. Cookie name to use: splunkd_8081.

The last 2 solutions depend upon the first method for using a user's credentials to create either the SESSION or the COOKIE ID.


Solution 1:

1) Here, we'll hit our Splunk server directly.

2) Pass the username and password on the command line.

3) Provide the Splunk options to find/fetch the Splunk data (for the Jenkins logs containing specific lines) and also perform some extra regex matching (so that it returns the source path for the exact numbered Jenkins build#, rather than listing 3 more source entries for the same console output: Jenkins' latestBuild, latestSuccessfulBuild etc. are symbolic links that point to a numbered build, and we don't want those symlink source entries in our output, so I'm using a regex pattern that only matches source paths which contain a NUMBERED build# right before the log file in the source path).

4) Then I'm using | to filter out only 3 fields: host, source and _raw (that Splunk returns). host contains the Jenkins server on which the Jenkins job ran. source contains the Jenkins job name, build# etc. info in its value. _raw contains the Jenkins job's console output (a few lines around the string/line we are searching for in the Jenkins job's console output).

NOTE: all 3 of these fields are available inside a dictionary variable called result, so I'm just outputting that.

5) Then I'm asking for the output in json format (you can also use csv). Finally, I'm using jq to filter out the information.

NOTE: if you use jq -r ".result._raw" (i.e. the _raw field inside the dictionary variable result), it'll give you the console output LINE by LINE (rather than one blob with \n embedded in it). There are other ways to do the same, but jq -r ".result._raw" was easy enough.

Commands run:

$ p="$(cat ~/my_secret_password.txt)" 
$ 
$ # The above command will set my password in variable 'p' 
$ 
$ curl --connect-time 10 --max-time 900 -ks https://splunk.mycompany.com:8089/services/search/jobs/export -umy_splunk_user:$p --data search='search earliest=-30m index=some-jenkins-prod-index source=/home/jenkins/jobs/*/builds/*/log ("WARNING: " OR "npm WARN retry" OR "svn: E200033: " OR ": binary operator expected" OR ": too many arguments" OR ": No such file or directory" OR "rsync: failed to set times on") | regex source=".*/[0-9][0-9]*/log" | table host, source, _raw' -d output_mode=json | jq ".result" 
$ 
$ # The following will give you LINE by LINE output for the console output 
$ curl --connect-time 10 --max-time 900 -ks https://splunk.mycompany.com:8089/services/search/jobs/export -umy_splunk_user:$p --data search='search earliest=-30m index=some-jenkins-prod-index source=/home/jenkins/jobs/*/builds/*/log ("WARNING: " OR "npm WARN retry" OR "svn: E200033: " OR ": binary operator expected" OR ": too many arguments" OR ": No such file or directory" OR "rsync: failed to set times on") | regex source=".*/[0-9][0-9]*/log" | table host, source, _raw' -d output_mode=json | jq -r ".result._raw" 


NOTE: the user ID and password are passed right after -u; there is no space between -u and the actual Splunk username, i.e. -umy_splunk_user:$p
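As a follow-up to Solution 1: if you only want the host and the source (the Jenkins job path) per matching line rather than the whole console output, here is a small hedged sketch that relies on the same /export output shape that jq ".result" uses above (the jq filter with @tsv is my choice, any jq formatting works):

$ curl --connect-time 10 --max-time 900 -ks https://splunk.mycompany.com:8089/services/search/jobs/export -umy_splunk_user:$p --data search='search earliest=-30m index=some-jenkins-prod-index source=/home/jenkins/jobs/*/builds/*/log ("WARNING: " OR "npm WARN retry" OR "svn: E200033: " OR ": binary operator expected" OR ": too many arguments" OR ": No such file or directory" OR "rsync: failed to set times on") | regex source=".*/[0-9][0-9]*/log" | table host, source, _raw' -d output_mode=json | jq -r '[.result.host, .result.source] | @tsv'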


Solution 2:

Solution no. 2 is to use a SESSION KEY/ID. For that, you first have to use the services/auth/login endpoint.

To generate the SESSION KEY/ID, run the following command.

NOTE: to generate the SESSION key you do need to provide your credentials first, but in later CURL/API calls/commands you only need to pass the SESSION key in the header.

1) Generate the session key/ID.

$ p=$(cat ~/my_secret_password.txt) 
$ curl -k https://splunk.mycompany.com:8089/services/auth/login --data-urlencode username=my_splunk_userid --data-urlencode password=$p 
<response> 
    <sessionKey>192fd3e46a31246da7ea7f109e7f95fd</sessionKey> 
</response> 
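If you'd rather capture the key into a shell variable than copy it by hand, here is a small sketch assuming the XML response shape shown above (the sed expression is mine, not something from the Splunk docs):

$ SESSION_KEY=$(curl -ks https://splunk.mycompany.com:8089/services/auth/login --data-urlencode username=my_splunk_userid --data-urlencode password=$p | sed -n 's/.*<sessionKey>\(.*\)<\/sessionKey>.*/\1/p')
$ echo $SESSION_KEY    # should print something like 192fd3e46a31246da7ea7f109e7f95fd

You can then pass ${SESSION_KEY} in the Authorization header exactly as in the next command.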

2) Use the session key/ID in subsequent searches.

In the subsequent requests, set the Authorization header value to the session key (the sessionKey value returned above); you no longer need to pass your credentials with -uYourUserID:YourPassword.

$ curl -k -H "Authorization: Splunk 192fd3e46a31246da7ea7f109e7f95fd" --connect-time 10 --max-time 900 https://splunk.mycompany.com:8089/services/search/jobs/export --data search='search earliest=-30m index=some-jenkins-prod-index source=/home/jenkins/jobs/*/builds/*/log ("WARNING: " OR "npm WARN retry" OR "svn: E200033: " OR ": binary operator expected" OR ": too many arguments" OR ": No such file or directory" OR "rsync: failed to set times on") | regex source=".*/[0-9][0-9]*/log" | table host, source, _raw' -d output_mode=json | jq ".result" 


NOTE:

1) For LINE by LINE output of the console output, use: jq -r ".result._raw"

2) For the number of occurrences found by the search, you can use | stats count (see the sketch right after these notes).
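For example, a hedged sketch of that count query (same placeholders as above; -d preview=false is my assumption that the export endpoint will then emit only the final row, drop it if your version complains):

$ curl --connect-time 10 --max-time 900 -ks https://splunk.mycompany.com:8089/services/search/jobs/export -umy_splunk_user:$p --data search='search earliest=-30m index=some-jenkins-prod-index source=/home/jenkins/jobs/*/builds/*/log ("WARNING: " OR "npm WARN retry" OR "svn: E200033: " OR ": binary operator expected" OR ": too many arguments" OR ": No such file or directory" OR "rsync: failed to set times on") | regex source=".*/[0-9][0-9]*/log" | stats count' -d output_mode=json -d preview=false | jq -r ".result.count"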

Now I can fetch the data I need in CSV or JSON format and use graphing capabilities to show the data in meaningful charts, or send email notifications if the thresholds are more or less than a given/expected value (as per my automation script).
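A rough sketch of that last part follows. The threshold value, the recipient address, the shortened keyword list and the use of the mail client are all illustrative assumptions, not something prescribed by Splunk:

#!/bin/bash
# Hedged sketch: count matching Jenkins log lines in Splunk for the last 30
# minutes and send a mail if the count crosses a threshold.
set -euo pipefail

p="$(cat ~/my_secret_password.txt)"
THRESHOLD=50                              # assumed alert threshold
RECIPIENT="build-team@mycompany.com"      # assumed recipient address

# Keyword list shortened for brevity; extend it exactly as in the commands above.
count=$(curl --connect-time 10 --max-time 900 -ks \
  https://splunk.mycompany.com:8089/services/search/jobs/export \
  -umy_splunk_user:"$p" \
  --data search='search earliest=-30m index=some-jenkins-prod-index source=/home/jenkins/jobs/*/builds/*/log ("WARNING: " OR "npm WARN retry" OR "svn: E200033: ") | regex source=".*/[0-9][0-9]*/log" | stats count' \
  -d output_mode=json -d preview=false | jq -r '.result.count // empty' | tail -1)
count=${count:-0}

if [ "$count" -gt "$THRESHOLD" ]; then
  echo "Splunk found ${count} warning/error lines in Jenkins logs in the last 30 minutes" \
    | mail -s "Jenkins log alert (via Splunk)" "$RECIPIENT"
fi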

For more info, see the Splunk REST API documentation for the search endpoints here: http://docs.splunk.com/Documentation/Splunk/6.6.3/RESTREF/RESTsearch and the search time modifiers here: https://docs.splunk.com/Documentation/Splunk/6.5.3/SearchReference/SearchTimeModifiers

second: s, sec, secs, second, seconds 
minute: m, min, minute, minutes 
hour: h, hr, hrs, hour, hours 
day: d, day, days 
week: w, week, weeks 
month: mon, month, months 
quarter: q, qtr, qtrs, quarter, quarters 
year: y, yr, yrs, year, years 

If you want to search the data for a 30-day window that ended 30 days ago (i.e. from 60 days back up to 30 days back), you need earliest=-60d latest=-30d in your search.
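For instance, only the time window part of the search string changes; everything else stays the same as in the earlier commands (keyword list shortened here):

search earliest=-60d latest=-30d index=some-jenkins-prod-index source=/home/jenkins/jobs/*/builds/*/log ("WARNING: " OR "npm WARN retry") | regex source=".*/[0-9][0-9]*/log" | table host, source, _raw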


Solution 3:

1) To create the COOKIE ID, run the following command.

curl -sSv https://splunk.mycompany.com:8089/services/auth/login --data-urlencode username=your_splunk_userid --data-urlencode password=your_splunk_secret_password -o /dev/null -d cookie=1 2>&1 

It'll spit out something like:

< Set-Cookie: splunkd_8081=5omeJunk_ValueHere^kjadaf33999dasdx0ihe28gcEYvbP1yhTjcTjgQCRaOUhco6wwLf5YLsay_2JgZ^J^SEYF9f2nSYkyS0qbu_RE; Path=/; HttpOnly; Max-Age=28800; Expires=Wed, 20 Sep 2017 00:23:39 GMT 

Now grab the value part of the < Set-Cookie: header (everything up to the semicolon) and store it in a variable, i.e.

export COOKIE="splunkd_8081=5omeJunk_ValueHere^kjadaf33999dasdx0ihe28gcEYvbP1yhTjcTjgQCRaOUhco6wwLf5YLsay_2JgZ^J^SEYF9f2nSYkyS0qbu_RE" 

2) Now use the cookie in your CURL commands to run queries similar to the ones we ran above. You do NOT need to pass the credentials with -uYourUserID:Password any more.

$ curl -k --cookie "${COOKIE}" --connect-time 10 --max-time 900 ... rest of the command here similar to examples shown above ... ...
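For instance, a spelled-out version of that (the same export search from Solution 1, with the cookie replacing -u; placeholders as above):

$ curl -k --cookie "${COOKIE}" --connect-time 10 --max-time 900 https://splunk.mycompany.com:8089/services/search/jobs/export --data search='search earliest=-30m index=some-jenkins-prod-index source=/home/jenkins/jobs/*/builds/*/log ("WARNING: " OR "npm WARN retry" OR "svn: E200033: " OR ": binary operator expected" OR ": too many arguments" OR ": No such file or directory" OR "rsync: failed to set times on") | regex source=".*/[0-9][0-9]*/log" | table host, source, _raw' -d output_mode=json | jq ".result"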