I'm learning C (and Cygwin) and trying to complete a simple remote-execution system for an assignment. C timer difference always returns 0 ms
One of the simple requirements is: 'The client will report the time taken by the server to respond to each query.'
I've searched around and tried implementing other working solutions, but I always get 0 as the result.
Here's a snippet of what I have:
#include <stdio.h>
#include <string.h>
#include <strings.h>   /* bzero */
#include <time.h>
#include <unistd.h>    /* read, write, sleep */

for (;;)
{
    //- Reset loop variables
    bzero(sendline, 1024);
    bzero(recvline, 1024);
    printf("> ");
    fgets(sendline, 1024, stdin);

    //- Handle program 'quit'
    sendline[strcspn(sendline, "\n")] = 0;
    if (strcmp(sendline, "quit") == 0) break;

    //- Process & time command
    clock_t start = clock(), diff;
    write(sock, sendline, strlen(sendline) + 1);
    read(sock, recvline, 1024);
    sleep(2);
    diff = clock() - start;
    int msec = diff * 1000 / CLOCKS_PER_SEC;
    printf("%s (%d s/%d ms)\n\n", recvline, msec / 1000, msec % 1000);
}
I also tried using a float, and dividing by 1000 / multiplying by 10000, just to see if any value at all would show through, but it always comes back 0. Clearly the way I'm implementing this is wrong, but after a lot of reading I can't figure out why.
- 編輯 -
Printout of the values:
clock_t start = clock(), diff;
printf("Start time: %lld\n", (long long) start);
//process stuff
sleep(2);
printf("End time: %lld\n", (long long) clock());
diff = clock() - start;
printf("Diff time: %lld\n", (long long) diff);
printf("Clocks per sec: %ld\n", (long) CLOCKS_PER_SEC);
Results:
Start time: 15
End time: 15
Diff time: 0
Clocks per sec: 1000
- FINAL WORKING CODE -
#include <sys/time.h>
//- Setup clock
struct timeval start, end;
//- Start timer
gettimeofday(&start, NULL);
//- Process command
/* Process stuff */
//- End timer
gettimeofday(&end, NULL);
//- Calculate difference in microseconds
//  (subtract the seconds first so tv_sec * 1000000 cannot overflow)
long long usec =
    (end.tv_sec - start.tv_sec) * 1000000LL +
    (end.tv_usec - start.tv_usec);
//- Convert to milliseconds
double msec = (double)usec / 1000;
//- Print result (3 decimal places)
printf("\n%s (%.3fms)\n\n", recvline, msec);
Note: I've already read [this question](http://stackoverflow.com/questions/18436734/c-unix-millisecond-timer-returning-difference-of-0?rq=1) and its solution didn't work. – Blake
What is the raw value of the 'clock_t'? 'printf("%lld\n", (long long) start)' – chux
The problem is in your assignment: you're assigning to an 'int', which truncates any value less than 1 to 0. Use 'float msec = (float)(clock() - start) / CLOCKS_PER_SEC;' instead. –