2016-07-24

(SSIS) Script Component "Index was outside the bounds of the array"

I am using an SSIS (Visual Studio 2015) Script Component (C# 2015) to make a web request. The response provides links to CSV files to download. I use those links to stream each CSV file into an output that I write to a SQL database through an OLE DB destination. The CSV files are split into 1000-row chunks.

I run the script once per data flow, one data flow per table, twelve data flows in total. Some complete without issue, but some hit this error after loading a few of the CSVs.

The error is:

at ScriptMain.CreateNewOutputRows() 
    at UserComponent.PrimeOutput(Int32 Outputs, Int32[] OutputIDs, PipelineBuffer[] Buffers, OutputNameMap OutputMap) 
    at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.PrimeOutput(Int32 outputs, Int32[] outputIDs, PipelineBuffer[] buffers) 

The script code is:

public override void CreateNewOutputRows() 
{ 
    /* 
     Add rows by calling the AddRow method on the member variable named "<Output Name>Buffer". 
     For example, call MyOutputBuffer.AddRow() if your output was named "MyOutput". 
    */ 
    var request = WebRequest.Create(Variables.urlparameter); 
    request.ContentType = "application/json; charset=utf-8"; 
    string responseText; 
    string[] fileArray; 
    // Dispose the response as well as the reader so the connection is released. 
    using (var response = (HttpWebResponse)request.GetResponse()) 
    using (var sr = new StreamReader(response.GetResponseStream())) 
    { 
     responseText = sr.ReadToEnd(); 
     string responseCleaned = responseText.Substring(responseText.IndexOf('[') + 1, responseText.IndexOf(']') - responseText.IndexOf('[') - 1).Replace("\"", ""); 
     fileArray = responseCleaned.Split(','); 
    } 

    foreach (string file in fileArray) 
    { 
     HttpWebRequest fileReq = (HttpWebRequest)WebRequest.Create(file); 
     using (HttpWebResponse fileResp = (HttpWebResponse)fileReq.GetResponse()) 
     using (Stream fileStream = fileResp.GetResponseStream()) 
     using (StreamReader reader = new StreamReader(fileStream, Encoding.UTF8)) 
     { 
      string responseString = reader.ReadToEnd(); 
      // Rows are delimited by a closing quote, a line break, and an opening quote. 
      string[] responseRows = responseString.Split(new string[] { "\"\r\n\"", "\"\n\"" }, StringSplitOptions.None); 
      foreach (string row in responseRows.Skip(1)) 
      { 
       Output0Buffer.AddRow(); 
       // Columns are delimited by quote-comma-quote. 
       string[] responseColumns = row.Split(new string[] { "\",\"" }, StringSplitOptions.None); 
       Output0Buffer.caseid = responseColumns[0]; 
       Output0Buffer.assignedfrom = responseColumns[1]; 
       Output0Buffer.groupname = responseColumns[2]; 
       Output0Buffer.createdate = responseColumns[3]; 
       Output0Buffer.createday = responseColumns[4]; 
       Output0Buffer.audittype = responseColumns[5]; 
       Output0Buffer.minimpact = responseColumns[6]; 
       Output0Buffer.casetype = responseColumns[7]; 
       Output0Buffer.auditid = responseColumns[8]; 
       Output0Buffer.impact = responseColumns[9]; 
       Output0Buffer.cti = responseColumns[10]; 
       Output0Buffer.createhour = responseColumns[11]; 
       Output0Buffer.assignedtoindividual = responseColumns[12]; 
       Output0Buffer.closurecode = responseColumns[13]; 
       Output0Buffer.contacttime = responseColumns[14]; 
       Output0Buffer.impact12time = responseColumns[15]; 
       Output0Buffer.region = responseColumns[16]; 
       Output0Buffer.requesterlogin = responseColumns[17]; 
       Output0Buffer.resolution = responseColumns[18]; 
       Output0Buffer.resolvedby = responseColumns[19]; 
       Output0Buffer.resolveddate = responseColumns[20]; 
       Output0Buffer.rootcause = responseColumns[21]; 
       Output0Buffer.rootcausedetails = responseColumns[22]; 
       Output0Buffer.prioritylabel = responseColumns[23]; 
       Output0Buffer.ecd = responseColumns[24]; 
       Output0Buffer.dedupekey = responseColumns[25]; 
       Output0Buffer.groupmanagerlogin = responseColumns[26]; 
       Output0Buffer.assigneemanagerlogin = responseColumns[27]; 
       Output0Buffer.site = responseColumns[28]; 
       Output0Buffer.status = responseColumns[29]; 
       Output0Buffer.timespent = responseColumns[30]; 
       Output0Buffer.initialescalationtime = responseColumns[31]; 
       Output0Buffer.lastmodifieddate = responseColumns[32]; 
       Output0Buffer.totaltimespent = responseColumns[33]; 
       Output0Buffer.referenceinfo = responseColumns[34]; 
       Output0Buffer.shortdescriotion = responseColumns[35]; 
       Output0Buffer.cticategory = responseColumns[36]; 
       Output0Buffer.ctitype = responseColumns[37]; 
       Output0Buffer.ctiitem = responseColumns[38]; 
      } 
     } 
    } 
} 
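Each row above is assumed to split into exactly 39 fields, so a row that splits short makes `responseColumns[n]` throw exactly this "Index was outside the bounds of the array" error. A minimal defensive sketch (the helper and sample data are hypothetical, not the actual feed) that validates the field count before any positional indexing:

```csharp
using System;

// Hypothetical helper (not part of the original script): split a CSV row
// on the quote-comma-quote delimiter and verify the field count before
// any positional indexing is attempted, so a short row fails with a
// descriptive message instead of an IndexOutOfRangeException.
static string[] SplitRow(string row, int expectedColumns)
{
    string[] columns = row.Split(new[] { "\",\"" }, StringSplitOptions.None);
    if (columns.Length != expectedColumns)
        throw new InvalidOperationException(
            $"Expected {expectedColumns} columns but got {columns.Length} in: {row}");
    return columns;
}

// Made-up three-field row: 1","alice","ops
string[] cols = SplitRow("1\",\"alice\",\"ops", 3);
Console.WriteLine(cols.Length);   // 3
```

Logging the offending row at the throw site would also have answered the debugger questions in the comments directly.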

In case it is useful, this is the JSON response I receive and use to make the web requests:

{"files":["https://sample.url.com/raw/fetch/451cf3ecc1/raw-0.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-1.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-2.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-3.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-4.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-5.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-6.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-7.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-8.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-9.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-10.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-11.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-12.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-13.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-14.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-15.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-16.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-17.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-18.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-19.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-20.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-21.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-22.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-23.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-24.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-25.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-26.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-27.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-28.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-29.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-30.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-31.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-32.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-33.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-34.csv","https://sample.url.com/raw/fetch/451cf3ecc1/raw-35.csv"],"info":["Defaulted to 'UTC' timezone"],"warn":["Truncated 'to' - dropped hours, minutes and seconds. Was 1469121708000, Used 1469059200000"]}
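Slicing this response with `IndexOf('[')`/`IndexOf(']')` only works while "files" is the first array and no earlier bracket appears in the payload. As a sketch of a less fragile approach using only the standard library (with an abbreviated, made-up copy of the response above), the quoted URLs can be pulled out directly:

```csharp
using System;
using System.Linq;
using System.Text.RegularExpressions;

// Abbreviated, made-up copy of the JSON response shown in the question.
string json = "{\"files\":[\"https://sample.url.com/raw/fetch/451cf3ecc1/raw-0.csv\","
            + "\"https://sample.url.com/raw/fetch/451cf3ecc1/raw-1.csv\"],"
            + "\"info\":[\"Defaulted to 'UTC' timezone\"]}";

// Extract every quoted http(s) URL; unlike Substring/IndexOf slicing,
// this is unaffected by where the "files" array sits in the payload.
string[] files = Regex.Matches(json, "\"(https?://[^\"]+)\"")
    .Cast<Match>()
    .Select(m => m.Groups[1].Value)
    .ToArray();

Console.WriteLine(files.Length);   // 2
```

A real JSON parser (e.g. Json.NET, if the script project can reference it) would be sturdier still, since a URL appearing in "info" or "warn" would also match this regex.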

Put a breakpoint in the script and step through the code in the debugger to get more information about the error. –


I tried that. It throws the error while adding the output columns, after a few hundred rows have been inserted. I could not find anything useful from it. From what I can gather, this happens when you try to write to an output that does not exist, so it is strange that it happens after the output has already been used many times. On a side note, watches do not work in the SSIS script component. When I run the code as its own VS project it runs fine, but then it also does not generate the output. –


OK then, on which line of code does the error occur? What is the value of the index when the error occurs, and what are the bounds of the array at that point? –

Answer


It turns out the problem was that the CSV row delimiter was \"\r\n\" and one of the column values also contained \"\r\n\". I updated the delimiter to [!,]\"\r\n\", which solved the problem.

The index error was a consequence of the split not populating the array as expected. Thanks for looking, and sorry that the issue posted turned out to be just my own mistake.
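To see why the split under-fills the array: when a column value itself contains the quote-line-break-quote sequence used as the row delimiter, the split cuts in the middle of the row, and the pieces on either side no longer carry the full set of columns. A minimal sketch with made-up data:

```csharp
using System;

// Two logical rows, but the first row's second column contains the
// sequence "\r\n" (quote, CRLF, quote) that is also the row delimiter.
// Splitting therefore cuts inside the first row and yields three short
// pieces instead of two complete rows; the downstream column split then
// has too few fields and positional indexing throws.
string payload = "a\",\"text with \"\r\n\" inside\"\r\n\"b\",\"second row";
string[] rows = payload.Split(new[] { "\"\r\n\"" }, StringSplitOptions.None);
Console.WriteLine(rows.Length);   // 3, not the expected 2
```

The [!,] prefix added to the delimiter works here because that longer sequence is unlikely to occur inside a column value, though a CSV parser that respects quoting would remove the risk entirely.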