English Paper Writing and Submission Experience (Lecture Notes)


國立臺南大學 數位學習科技系 黃國禎
2006/3/21

Types of Papers
- Conference papers: typical review time about 2 months; acceptance rate depends on the scale of the conference (roughly 50%-90%).
- Journal papers: review time averages 1-2.5 years; acceptance rate below 25%.
  - SSCI (Social Science Citation Index)
  - SCI (Science Citation Index)
  - EI (Engineering Index)
  - TSSCI (Taiwan Social Science Citation Index)
  - Others

Review Criteria of Typical Journals
- Academic value (novelty)
- Application value
- Soundness of the theoretical basis and correctness of the viewpoints
- Organization and structure of the article
- Rigor of the research method
- Suitability of the title
- Appropriate article length
- Correct formatting
- Correct wording and fluent writing

Research Concepts, Attitude, and Methods
- Concept: research is interesting.
- Attitude: stay curious, give it your full effort, and work closely with your advisor.
- Method: real contributions rather than mere hard work; innovation supported by empirical evidence.

Paper Structure for System-Oriented Research
- Title and authors
- Abstract
- Introduction
- Relevant Research (Literature Review)
- System Structure
- XXX Approach (Algorithm)
- Experiments and Evaluation
- Conclusions
- References

Paper Structure for Method-Oriented Research
- Title and authors
- Abstract
- Introduction
- Relevant Research (Literature Review)
- Problem Definition
- XXX Approach (Algorithm)
- Experiments and Evaluation
- Conclusions
- References

Suggested Order for Writing a Paper
Decide on the topic → Related work → Problem definition or system structure → Research method → System implementation, experiments, and evaluation → Introduction → Abstract → Conclusions

Paper Title
- 10-15 words.
- Should immediately convey the purpose and contribution of the research.
- Development of a Testing System (X)
- A New Test Sheet Generating Method (X)
- A Novel Approach to Composing Test Sheets for Multiple Assessment Criteria in Building Testing Systems (O)
- Development of a Testing System to Meet Multiple Assessment Requirements (O)

Related Work
- Avoid trying to adapt the Chinese version of your related-work section; rewriting it directly in English is usually faster.
- First collect 10-20 related papers from the last ten years.
- Pick the 2-3 most directly relevant papers and draw on their literature reviews to describe the motivation behind the problem.
- Then, based on the abstracts of the other papers, describe the developments of the last ten years in chronological segments, in about 1000-1500 words.
- Key point: tell a story (add connecting remarks so that the material reads as a coherent whole).

Example:

In recent years, researchers have developed various computer-assisted testing systems to evaluate students' learning status more precisely. For example, Feldman and Jones attempted to perform semi-automatic testing of student software under Unix systems [5], and Rasmussen et al. exercised a system to evaluate students' learning status on computer networks by taking their progress into consideration [22]; furthermore, Chou proposed the CATES system [3], which is an interactive testing system developed in a collective and collaborative project with theoretical and practical research on complex technology-dependent learning environments. Unfortunately, although many computer-assisted testing systems have been proposed, few of them have addressed the problem of finding a systematic approach for composing test sheets satisfying multiple assessment requirements. Most of the existing systems construct a test sheet by manually or randomly selecting test items from their item banks. Such manual or random test item selection strategies are inefficient and are hardly able to meet multiple assessment requirements simultaneously.

Some previous investigations showed that a well-constructed test sheet not only helps evaluate the learning status of the students, but also facilitates the diagnosis of the problems embedded in the learning process [13, 14, 17]. That is, it is very critical to select proper test items to constitute a test sheet that meets multiple assessment criteria, including the expected time needed for answering the test sheet, the number of test items, the specified distribution of course concepts to be learned, and, most importantly, the maximization of the average degree of discrimination [20]. Since it is difficult to satisfy multiple requirements (or constraints) in selecting test items, most computerized testing systems generate test sheets in a random fashion (which will be called "random selection" throughout this paper) [16]. In [15], a multiple-criteria test sheet generating problem is formulated as a dynamic programming model [11] to minimize the distance between the parameters (e.g., discrimination, difficulty, etc.) of the generated test sheets and the objective values, subject to the distribution of concept weights.

Although the dynamic programming approach has taken multiple requirements into consideration, in practical applications more criteria need to be addressed. For example, a teacher might like to assign a range of test times instead of giving a fixed test time, and usually a teacher will assign an expected lower bound for each concept weight instead of giving a distribution of concept weights. Moreover, the goal of a group test is to discriminate the status of the students, which implies that the discrimination degree of the entire test sheet needs to be maximized as much as possible. Perhaps the most critical issue arising from the use of the dynamic programming approach is the exceedingly long execution time required for producing optimal solutions. As the time complexity of the dynamic programming algorithm is exponential in terms of the input data, the execution time will become unacceptably long if the number of candidate test items is large.

To cope with these problems, researchers attempted to formulate a new test sheet-generating problem by optimizing the discrimination degree of the generated
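The sample paragraphs above frame test sheet composition as a constrained optimization problem: maximize the average discrimination degree of the selected items while respecting a test-time range, a bound on the number of items, and lower bounds on concept weights. The cited models themselves are not reproduced in the excerpt, so the following is only a minimal illustrative sketch; every symbol in it (x_i, d_i, t_i, r_{ij}, h_j, and the bounds) is introduced here for illustration rather than taken from the cited papers.

% Illustrative sketch only: one possible 0-1 formulation of test sheet composition.
% x_i = 1 if test item i is selected; d_i and t_i are the discrimination degree
% and expected answering time of item i; r_{ij} is the relevance of item i to
% concept j; h_j is a lower bound on the weight of concept j; T_min/T_max and
% N_min/N_max bound the total testing time and the number of selected items.
\[
\begin{aligned}
\max_{x_1,\dots,x_n \in \{0,1\}} \quad & \frac{\sum_{i=1}^{n} d_i x_i}{\sum_{i=1}^{n} x_i} \\
\text{subject to} \quad & T_{\min} \le \sum_{i=1}^{n} t_i x_i \le T_{\max}, \\
& N_{\min} \le \sum_{i=1}^{n} x_i \le N_{\max}, \\
& \sum_{i=1}^{n} r_{ij} x_i \ge h_j \sum_{i=1}^{n} x_i, \qquad j = 1, \dots, m.
\end{aligned}
\]

Because the objective is a ratio of sums, even this simplified version is a nonlinear 0-1 program, which is consistent with the excerpt's point that exact methods such as dynamic programming become impractical when the number of candidate test items is large.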
