Integrating Soft Skills Assessment Through University, College, and Programmatic Efforts at an AACSB Accredited Institution
By Beard, Debbie; Schwieger, Dana; Surendran, Ken
ABSTRACT
The growing demand for verification that students are, indeed, learning what they need to learn is driving institutions and programs to develop tools for assessing the level of knowledge and skills of their graduating students. One such tool, the Information Systems Analyst (ISA) certification, is a recently developed instrument for measuring eight skill areas based upon the IS2002 Model Curriculum. While the exam strongly evaluates the technical skill set of Information Systems (IS) majors, in this paper, the authors suggest additional means of addressing and measuring requisite soft skills for Information Technology (IT), accounting, and other business students. In this article, the authors address the concerns voiced by the employers of college graduates regarding the apparent insufficient competency in soft skills and suggest an assurance of learning model for incorporating these skills into curricula. In addition, the authors share activities at the university, college, and program level to integrate the assessment of soft skills at educational institutions.
Keywords: Soft skills, Assurance of learning, Soft skills assessment
1. INTRODUCTION
The study of information systems in a university environment involves a complex combination of technical, business, organizational, and interpersonal skill requirements. A process for demonstrating success in building those skill sets has been pursued as educators respond to calls from professional organizations, accrediting agencies, legislators, and others to demonstrate accountability. In addition, our quest for assurance of learning and continuous improvement requires benchmarks, data collection and analysis, and feedback that can highlight demonstrated competencies, actions that should be taken, and the consequences of actions taken. Explicitly setting goals and objectives relating to soft skills in our strategic planning, curriculum development, and pedagogy is therefore important.
Some of the institutions involved in identifying competencies, establishing standards, providing guidance, and developing assessment tests include the Association for Computing Machinery (ACM), the Association of Information Technology Professionals (AITP), the Association for Information Systems (AIS), the American Accounting Association (AAA), the American Institute of Certified Public Accountants (AICPA), the International Federation of Accountants (IFAC), and the Association to Advance Collegiate Schools of Business (AACSB), to name a few. Assessment plans, strategies, tools, tests, and programs are being developed to generate data for assessing learning outcomes. Examinations, case studies, internship observations, projects, and work portfolios are some of the many methods available for assessing student knowledge, skills, and capabilities.
In this paper, the authors first review relevant literature relating to soft skills and educational assessment issues, including the assessment concerns and academic requirements of the Assurance of Learning Standards emphasized by the AACSB and guidance provided by various professional organizations and individuals in designing and implementing assessment. Subsequent sections provide a proposed model for assurance of learning and examples of university, college, and programmatic activities relating to assessment.
2. RELATED LITERATURE: SOFT SKILLS
Reports from various professional organizations and individuals have examined the changing demands placed on accounting and information technology professionals (AECC, 1990; Albrecht & Sack, 2000; AAA, 1998; AICPA, 1998; Arthur Andersen & Co., 1989; Cheney, Hale, & Kasper, 1990; Gallivan, Truex, & Kvasny, 2004; IMA, 1999; Lee, Trauth, & Farwell, 1995; Misic, 1996; Robert Half Intl., Inc., 2006; Segars & Hendrickson, 2000; Todd, McKeen, & Gallupe, 1995; Wade & Parent, 2001/2002; Wynekoop & Walz, 2000). Increased emphasis on “soft skills,” or nontechnical skills, was a consistent conclusion from our review of the literature.
Gallivan et al. (2004) identified the six most common non-technical skills mentioned in employment advertisements as 1) communication, 2) interpersonal, 3) leadership, 4) organization, 5) self-motivation, and 6) creativity. Of the total skills mentioned in online job advertisements, non-technical skills represented 26 percent. Future employees will need to be
“flexible-to fit where they’re needed, rise to new levels of expectation and transition into areas in which they can contribute and continue to learn. They will interact with individuals at all levels of an organization and, therefore, work with and motivate people who have a variety of professional strengths, skills, and areas of interest. Leadership abilities, team-player skills, and project-management expertise will be essential. . . .written and verbal communication ability, professional poise and strengths in motivating, working with and leading others will gain new importance. . . interpersonal skills and the ability to conceptualize solutions and explain them to clients and employers are extremely important” (Robert Half Intl., Inc, 2006).
Thacker and Yost (2002) noted that students need to be trained to be effective team members as employers often find that graduates lack good team leadership skills. The ability to work with others and communicate ideas, in both verbal and written format, is critical to the future employee.
The Job Outlook 2008 Survey of 276 employers (Koncz & Collins, 2007) examined the qualities that employers look for in prospective employees. Data were collected for each characteristic studied using a five-point scale ranging from “1,” indicating that the characteristic was “not important,” to “5,” indicating that the characteristic was “extremely important.”
Respondents overwhelmingly indicated that the key skills most lacking in new college graduate candidates were verbal and written communication skills (Koncz & Collins, 2007). Thus, employers are reporting that skills we typically view as “soft skills” in accounting, MIS, and CS are, indeed, extremely important. Deficiencies in these skill sets are not only a concern for potential employers and accrediting bodies such as the Association to Advance Collegiate Schools of Business (AACSB), but also present challenges for IT and accounting practitioners and educators.
3. RELATED LITERATURE: ASSESSMENT
The increasing interest in accountability among the general public and educational institutions has driven professional organizations, accrediting institutions, and, finally, colleges and universities to begin developing methodologies for assessing student learning outcomes (Black & Duhon, 2003; Landry, Longenecker, Pardue, McKell, Reynolds, & White, 2006; McGinnis & Slauson, 2003; Martell, 2007; McKell, Reynolds, Longenecker, Landry, & Pardue, 2006; Reynolds, Longenecker, Landry, Pardue, & Applegate, 2004). To address concerns regarding information systems education, ACM, AITP, and AIS have jointly sponsored the development and revision of model curriculum guidelines (McKell et al., 2006). In the wake of recent concerns regarding educational outcomes across the college curriculum, further steps were taken by the Institute for Certification of Computing Professionals (ICCP) and the Center for Computing Education Research (CCER), which formed a team of 40 universities and colleges to develop an assessment tool “to assess the knowledge and practical readiness of IS students and professionals and to evaluate, improve, and accredit undergraduate information systems degree programs” (Reynolds et al., 2004: 4). This assessment tool was developed with consideration for the IS 2002 Model Curriculum as well as criteria listed in IS entry-level position advertisements (McKell et al., 2006).
The resulting product of the coalition’s efforts was an assessment tool with a three-pronged focus. The section examining information technology skills included questions focusing upon software development, web development, databases, and systems integration skills. The portion of the examination focusing upon organizational and professional skills tested individual and team interpersonal skills as well as business fundamentals. The section providing questions related to strategic organizational systems development using IS focused upon organizational systems development and project management skills (Reynolds et al., 2004).
The AICPA provides the Educational Competency Assessment (ECA) website to help educators integrate the skills-based competencies needed by entry-level accounting professionals. The competencies, defined within the AICPA Core Competency Framework Project, have been derived from academic and professional competency models and have been widely endorsed within the academic community (http://www.aicpa-eca.org).
The International Federation of Accountants (IFAC) has offered international education guidelines for professional accountants in assessment. The draft guidelines consider the key concepts in assessment, provide a summarized evaluation of relevant assessment methods, and consider which assessment methods are best suited to test different capabilities and competencies (http://www.ifac.org). Gainen and Locatelli (1995) provided background on the assessment movement in the U.S., outlined a model for developing an assessment program, provided guidance for faculty to use in assessment, and illustrated the use of assessment as a tool for continuous improvement of learning outcomes and client satisfaction. Demong et al. (1994) examined various issues involved in designing an assessment program, identifying various assessment methodologies that could be used in assessing accounting programs. Likewise, Akers et al. (1997), noting that published research on assessment methods in use in accounting education was limited, focused not only on the design but also on the implementation of an assessment program at their institution. They outlined how their assessment committee and faculty developed six intended student outcomes, established quantifiable goals, developed measurement tools to evaluate the goals, and identified mechanisms to provide feedback for continuous improvement.
Aasheim et al. (2007) described the assessment process designed and implemented for an information technology (IT) program, with specific emphasis on course-level assessment, and provided several examples of course-level assessments. White and McCarthy (2007) discussed the use of the Center for Computing Education Research (CCER) IS Assessment Test in the development and implementation of a comprehensive assessment plan on their campus. Paranto and Shillington (2006) addressed issues relating to the use of a well-designed multiple-choice test as a placement tool when logistics and other factors limit the use of technology in assessing student skills. Stemler and Chamblin (2006) shared their experiences and outlined procedures for developing an assessment strategy to achieve accreditation and to improve their MIS program. Todorova and Mills (2007) recommended a four-stage approach to the evaluation and development of assessment portfolios for IS education that utilize diverse methods for assessment.
Other oversight and accrediting bodies have been encouraged to address similar assessment concerns for their fields of study (Black & Duhon, 2003). The AACSB is one such body, having developed new accreditation standards focusing upon assurance of learning (Black & Duhon, 2003). The intent of the Assurance of Learning Standards emphasized by the AACSB is to “…evaluate how well the school accomplishes the educational aims at the core of its activities” (AACSB, 2006).
Martell (2007) stressed the change in focus of the revised AACSB standards on assurance of learning and provided examples of how assessment results can be used to improve curricula. She also provided insight into the problems some schools have in meeting the assurance of learning standards. In their findings from a survey of 138 AACSB-accredited schools, Pringle and Mitri (2007) reported on the continued use of indirect measures such as surveys, the time involved in assessment, and some of the results that assessment yields.
According to AACSB Accreditation Standard 15, although universities do not need to provide specific courses addressing the following undergraduate skills, programs need to provide learning experiences addressing both general and management-specific learning goals, including (AACSB, 2006):
* Communication abilities,
* Ethical understanding and reasoning abilities,
* Analytic skills,
* Use of information technology,
* Multicultural and diversity understanding,
* Reflective thinking skills.
These goals need to be routinely assessed and systematically evaluated, with the analyzed findings distributed to faculty to assist them in making continual adjustments to their course curricula.
Thus, while faculty at AACSB-seeking institutions prepare their course objectives to adequately cover discipline-specific technical skills, such as those advocated in the ACM IS2002 Model Curriculum Guidelines, the ISA certification examination, or the AICPA Core Competency Framework, they must also address the assessment requirements established by accrediting bodies such as the AACSB. At the same time, they must incorporate material addressing employers’ growing demands for increased competence in the nontechnical skills of their prospective hires.
4. A MODEL FOR AN ASSESSMENT PROCESS
Curricula models and assessment activities are important components of the continuous improvement process in education (McGinnis & Slauson, 2003). Program assessment involves setting goals and objectives for the program, undertaking activities that measure success in reaching those goals and objectives, and then implementing necessary changes to improve program quality.
The integration of “soft skills” into the curricula to address the requirements of employers and engage today’s learner in the learning process should proceed in a strategic and well-organized manner as modeled in Figure 1. Identification and specification of the goals relating to requisite “soft skills” requirements should be addressed and learning objectives developed.
Strategies for achieving these objectives, including various pedagogy-related activities and methods focusing upon presentation and practice, should be formulated and implemented. This step may require refinement of the goals and adjustment of the objectives if effective methods and activities cannot be identified or integrated into the curricula in a beneficial manner. Regular measurements of knowledge, skills, and attitudes should be taken to determine student competencies and to compile the outcomes of the activities and methods. Evaluations of student performance, tied to the measures and outcomes, the activities and methods, and the goals and objectives of the learning process, should be developed, reported, and used to suggest appropriate actions, demonstrating a focus on continual quality improvement.
5. UNIVERSITY EFFORTS TO DEVELOP AND ASSESS SOFT SKILLS
At our university, several initiatives have been taken to develop and strengthen the “soft skills” of all our students and to provide a formal, ongoing approach to assessment. For example, our general education curriculum is built on demonstrating competency in nine objectives:
1. To demonstrate the ability to locate and gather information
2. To demonstrate capabilities for critical thinking, reasoning, and analyzing skills
3. To demonstrate effective communication skills
4. To demonstrate an understanding of human experiences and the ability to relate them to the past
5. To demonstrate an understanding of human experiences and their interrelationships
6. To demonstrate the ability to integrate the breadth and depth of knowledge and experience
7. To demonstrate the ability to make informed, intelligent value decisions
8. To demonstrate the ability to make informed, sensitive aesthetic responses
9. To demonstrate the ability to function responsibly in one’s natural, social, and political environment
A formal assessment plan has been developed and implemented for the program. The development of soft skills has been integrated into classroom activities, and student performance is assessed through an array of activities including case studies, special projects, group work, and oral and written presentations. Several years ago, our institution implemented a writing outcomes assessment program to test and, where necessary, remediate the written communication skills of upperclassmen before they graduate. All students must take the examination upon completing 75 semester hours at our institution, and successful performance on the examination is required for graduation.
Assessment of the University Studies Program and Writing Outcomes, as well as other programs throughout our university, has been formalized into strategic planning and operational decisions. A University Assessment Committee was established several years ago to be proactive in ensuring that departments and programs conduct ongoing, self-focused, results-based assessments. This committee has served as both a resource for best practices in assessment and an oversight board for evaluating program assessment plans and processes. The Committee evaluates annual assessment reports from various program administrators and shares the results with the University community through the Provost’s website. Trends in the data are compiled over time and are also made available for public review. The Committee developed a guiding document, “The 15 Principles of Assessment,” which advocates that:
1. Assessment should be a systematic, ongoing process that involves gathering, interpreting, and using information for continuous improvement.
2. Assessment should focus on specific programs and activities that contribute to the intellectual, professional, personal, and cultural needs of students.
3. Assessment should be shaped and guided by faculty, students, and staff, with administration and administrative processes providing essential support.
4. Assessment should flow from the institutional mission, and the institution’s mission should be shaped by the results of assessment when appropriate.
5. Assessment outcomes should be used in planning, budgeting, and allocating resources.
6. Flexibility in the choice of assessment procedures should be encouraged, permitting the exercise of professional judgment as to the appropriate methods of assessment.
7. Assessment should be based on multiple measures, both quantitative and qualitative, including, for example, locally developed instruments, surveys, nationally-normed exams, external reviews, exit interviews, historical data, and evaluation of performances.
8. The use of assessment results should determine the choice of assessment procedures.
9. Assessment should be cost-effective.
10. Assessment procedures should be regularly evaluated as to their usefulness for fostering continuous program improvements.
11. While assessment for accountability may be necessary, it should be integrated as far as possible into assessment for improvement.
12. Assessment activities should be minimally intrusive on faculty, students, and staff.
13. Assessment plans and activities should be continuously evaluated and improved through peer review and discussion.
14. There should be regular comprehensive reviews of the assessment plan.
15. Assessment and the use of assessment results should not unfairly restrict institutional goals of diversity and access.
More information concerning assessment at our institution can be found at http://www2.semo.edu/provost/uarc2.
Another campus-wide initiative at our institution with implications for soft skills development and assessment has focused on experiential learning. Again, a predominantly decentralized, yet centrally supported, approach has been taken. For example, each department has an internship coordinator, and campus programming provided by the Department of Career Linkages and sponsored by the Division of Student Affairs has also emphasized and supported experiential learning.
6. COLLEGE EFFORTS TO DEVELOP AND ASSESS SOFT SKILLS
At our AACSB-accredited institution, changes in AACSB standards have served as a catalyst for formalizing our assessment activities in the College of Business. Working through the Dean’s Office in the College of Business, faculty and staff have provided input to the College Assurance of Learning Committee. The Committee was charged with outlining the assurance of learning process built around the learning goals and objectives adopted by our faculty and with identifying courses where assessments would be made. Table 2, a product of the Committee’s work, serves as an example of a planning rubric that could be used to fulfill the steps in the process of providing assurance of learning in college curricula.
The matrix will vary for other universities and programs based upon differences in mission. The first column in the matrix corresponds with the critical general business soft-skills deficiencies identified by the AACSB (AACSB, 2002). The “Objectives” column indicates the desired outcome for each goal. The “Measurements” column illustrates some possible general tools that can be used to work toward the goals and achieve the objectives. The “Evaluation” column describes how the measurement activities will be observed in order to determine the level of performance. Based upon observations of the evaluation results for the measurement instruments used to examine performance of the objectives, corrective actions may need to be taken in order to continue to improve the quality of university graduates. The last column will be used to record actions that can be taken to correct or improve the assurance of learning process.
The Assurance of Learning Matrix (Table 2) is currently in its first stages of use. As we continue the assessment process, assessment goals and tools will be added and/or modified. The matrix can be used not only as a guide to identify goals, objectives, and measurements of learning outcomes, but also as a means for communicating the evaluation of performance and actions to be taken. This important process can assist educators in matching strategies, activities, and course materials that meet the needs and objectives of future employers and fields of study.
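The column structure of such a planning matrix can be sketched as a simple record type. This is an illustrative sketch only; the field names mirror the matrix headings described above, and the example row is hypothetical, not drawn from the institution's actual Table 2.

```python
from dataclasses import dataclass, field

@dataclass
class AssuranceOfLearningRow:
    """One row of an assurance-of-learning planning matrix.

    Field names mirror the matrix columns described in the text:
    goal, objectives, measurements, evaluation, and actions taken.
    """
    goal: str                 # soft-skill learning goal
    objective: str            # desired outcome for the goal
    measurements: list[str]   # instruments used to assess the objective
    evaluation: str           # how measured performance is judged
    # Corrective actions are recorded later, as evaluation results come in.
    actions: list[str] = field(default_factory=list)

# Hypothetical example row, for illustration only:
row = AssuranceOfLearningRow(
    goal="Effective written communication",
    objective="Students produce clear, well-organized business writing",
    measurements=["Writing evaluation rubric", "Case write-ups"],
    evaluation="Faculty score writing samples against the rubric",
)
row.actions.append("Add a revision cycle to the capstone report")
print(len(row.measurements))  # 2
```

Keeping the "actions" column as an initially empty list reflects the process described above: the matrix is filled in left to right as the assessment cycle closes.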
Because proficiency in oral and written communication and the application of critical thinking skills to business problems and ethical decisions were identified as goals for our Bachelor of Science in Business Administration degree, rubrics were designed, as the matrix indicates, for evaluating written communication skills, ethical reasoning, and oral communication skills. Data collected from these evaluations can then be used for benchmarking and analysis. Some of the assessment rubrics currently in use at our university are provided in the following tables:
* Table 3 – The College Writing Evaluation Form;
* Table 4 – The Evaluation Form for Ethical Reasoning in Business;
* Table 5 – The College Presentation Evaluation Form.
7. DEPARTMENT AND PROGRAM EFFORTS TO DEVELOP AND ASSESS SOFT SKILLS
The demonstration of soft skills by our graduates has also been identified as an important goal for our programs in accounting, management information systems, and computer science. At our institution, the responsibility for administering assessment is decentralized. In general, the units closest to the delivery of programs have primary responsibility for the design, implementation, and use of assessments. However, an annual assessment report is prepared, submitted to the Dean of the College and the University Vice-Provost, and evaluated by the University Assessment Review Committee. The report must match each assessment method with specific goals and objectives, and a rationale for using each method is required. Data collected and analyses are also included in the report as appropriate. Conclusions and responses taken or planned as a result of assessment are disclosed. Being able to demonstrate a closing of the assessment loop has become increasingly important.
A diverse array of tools and activities is available for developing and assessing the knowledge and skills of our students, including their soft skills. Some assessment tools and activities available include: comprehensive exit examinations or exit interviews; class projects; portfolios; surveys of students, alumni, and employers; pre-tests/post-tests; pass rates on professional certification examinations and other nationally-normed, standardized tests; scores on locally-developed achievement tests; and career placement rates. At our institution, a variety of methods are used in assessing soft skills.
7.1 Experiential Learning Projects
For IT, accounting, and other business programs at our university, internships and other experiential learning opportunities have been a valuable source of such data. Through the efforts of the internship coordinators, faculty, and professional staff, internships and other experiential learning opportunities have been integrated into our curricula as part of either required or elective courses. These efforts have provided students with opportunities to not only further their technical skill sets but also enhance their “softer side.” Examples of experiential learning and the learning-by-doing method for developing soft skills are provided in this section.
The first example involves a combined class of Management Information Systems (MIS) and Applied Computer Science (ACS) students. In 2002, the analysis and design course focused upon a group assignment in which teams of four or five students designed and developed a four-phase project over the course of the semester. Team selection was left to the students, with the condition that each team have at least one ACS and one MIS student. The creation of mixed-major groups not only provided the students with a realistic work environment, but also helped them learn how to interact with team members having differing perspectives and skill sets. During the course of the project, each student was required to assume at least three different team member roles, which included: business analyst, systems analyst, process designer, database designer, interface designer, programmer, team coordinator, and researcher. Role-playing activities like these are important because they offer students greater opportunities to apply and develop their soft skills. Every role that a student could assume required at least one soft skill, if not more, in order to carry out the assignment properly.
At the end of the course, the students were asked to complete a questionnaire that, among other things, addressed the soft skills development that arose from working in mixed-major groups. The soft skills considered were interpersonal, communication, team building, planning, and leadership. The students indicated their perceptions concerning soft skills development on a five-point progressive scale. The survey was conducted in Spring 2002 and Fall 2002 and was given to a combined total of 49 students. A total of 34 students responded (21 MIS majors and 13 ACS majors), for a response rate of 69.4 percent. Their combined responses concerning soft skills are indicated in Table 6. For complete results that include technical skills, see Surendran et al. (2005).
The objective in presenting the perceptions of the two classes separately (Table 6) is not to compare the two groups but to emphasize that both groups benefited from this teaching approach. The combined course offering helped both majors achieve above-average soft skills development.
After completing the systems analysis and design course, the students were required to carry out, as a group assignment, a client-sponsored system development project in their respective capstone experience (MIS or ACS) course. In these courses, they interacted with clients and produced a working prototype system to meet the clients’ needs. The students were required to make four in-class presentations and a final project presentation to all project clients and the academic board members of the department. Over the past three years, fifteen projects have been completed. One of the evaluative items for the project is the quality of documentation (both system and user). The average rating on a five-point scale for the quality of documentation for the fifteen projects was 3.82.
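The simple survey statistics used above can be reproduced with a short script. The response counts (49 students surveyed, 34 responding) come from the text; the sample Likert ratings in the final line are hypothetical placeholders, not the study's raw data.

```python
# Sketch of the survey tabulations described in the text. The response
# counts (49 surveyed, 34 responding) are taken from the paper; the
# example ratings are hypothetical, for illustration only.

def response_rate(responded: int, surveyed: int) -> float:
    """Percentage of surveyed students who returned the questionnaire."""
    return round(100 * responded / surveyed, 1)

def mean_rating(ratings: list[int]) -> float:
    """Mean of ratings on a 1-5 Likert scale, rounded to two decimals."""
    if any(r < 1 or r > 5 for r in ratings):
        raise ValueError("ratings must fall on the 1-5 scale")
    return round(sum(ratings) / len(ratings), 2)

print(response_rate(34, 49))          # 69.4, as reported for the 2002 surveys
print(mean_rating([4, 4, 3, 5, 4]))   # mean of one hypothetical item's ratings
```

The same mean-rating calculation underlies the 3.82 documentation-quality average reported for the fifteen capstone projects.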
7.1.3 Internship Skill Assessments
In our accounting and MIS programs, each intern completes a weekly diary/journal, a final paper, and an oral presentation. In addition, students completing an internship are required to complete a self-assessment survey. Students are asked to evaluate themselves on several traits, with “5” indicating “Outstanding” and “1” indicating “Poor.” Students are also asked to circle those traits in which they think they have improved significantly during the internship. The results of self-assessments for two years are shown in Table 7. Those items marked with an “*” were noted by three or more student interns as having improved significantly as a result of the internship.
At the end of each accounting internship, a supervisor’s evaluation form is completed. This evaluation provides an opportunity to reflect on the student’s attitude, initiative, dependability, maturity, judgment, ability to learn, quality and quantity of work, relations with others, and attendance and punctuality. The results from these evaluations are shared with the intern, reviewed by the internship coordinator, and available for inclusion in the Department Assessment Report.
In addition to the above evaluation by the supervisor, a College of Business Internship Survey is mailed each semester to supervisors for that semester’s internships. The survey relates to specific goals and objectives established by the Department and College. Respondents are asked to circle the appropriate numbers from “1” (not at all) to “7” (a great deal) on questions related to the intern’s display of appropriate communication skills, problem-solving skills, teamwork skills, leadership skills, and microcomputer applications skills; understanding of general current business issues, accounting, economics, finance, management, marketing, and office systems; and the intern’s overall preparation for the internship. Results from a recent three-year period are shown in Table 8.
Time-series data, like that in Table 8, have been tabulated and reviewed along with supervisor comments during the assessment process. Faculty and administrators have reviewed and analyzed the trends and discussed actions to be taken for improvement. Due at least in part to the results received from our internship assessment tools, faculty have in recent years integrated more activities focusing on oral and written communication skills, computer applications, problem-solving, teamwork, and leadership into our courses and co-curricular activities. In addition, students have been made aware of the importance of being punctual, dependable, appropriately dressed and groomed, and self-confident, and have been encouraged to take initiative, manage their time effectively, and accept criticism. Responses have also provided valuable feedback for improving both the internship program and the accounting program overall.
8. CONCLUSION
The educational needs of the typical college student and the corporate environment are continuously changing. Concerns regarding the adequacy of the preparation of graduates for success in their fields are arising both in and outside the university environment. Thus, university accounting, IT, and other business educators need to stay up-to-date on the desired and expected knowledge, skills, and attitudes for the various career paths of their graduates. Developing a means of identifying and assessing skills and knowledge, as well as providing a process for acting on those assessments, is critical to providing curricula and pedagogy that promote continuous improvement and demonstrate accountability.
Assessment should be consistent with the mission of the university, college, program, and fields of study and should be integrated into accountability and the continuous improvement of learning and teaching. Goals, objectives, and standards for student performance should be identifiable, measurable, minimally intrusive, and cost effective. Multiple measures should be chosen to demonstrate the assurance of learning. Experiential learning opportunities involving collaborative projects that connect leading-edge practitioners, professional organizations, student groups, and/or colleagues, both within and outside specific disciplines, not only increase the depth of the academic experience but also provide invaluable opportunities to develop and demonstrate soft skills.
Future activities and methods should tap the tools that today's students increasingly use. Common social networking tools should be incorporated from an academic perspective to enhance communication skills and connect students to the learning process. Blogs could be used to enhance written communication through the development of streams of thought or as tools for providing student-directed project assistance. Second Life and social utilities can also be used in group projects for virtual meetings or to bring together students with common career aspirations.
REFERENCES
AACSB International, AACSB Accreditation Standards: Assurance of Learning, Retrieved October 11, 2006, from http://www.aacsb.edu/resource_centers/assessment/std-intent.asp
AACSB (1995-1996), A Report of the AACSB Faculty Leadership Task Force, AACSB International, St. Louis, MO, pp. 5-6.
Aasheim, C., Gowan, J. and Reichgelt, H. (2007), "Establishing an Assessment Process for a Computing Program," Information Systems Education Journal. Vol. 5, No. 1, Retrieved January 4, 2008, from http://isedj.org/5/1
Accounting Education Change Commission (AECC). (1990) Objectives of Education for Accountants: Position Statement Number One. Issues in Accounting Education, (Fall) pp. 307-312.
Akers, M., Giacomino, D. and Trebby, J. (1997), Designing and Implementing an Accounting Assessment Program, Issues in Accounting Education, (Fall) pp. 259-280.
Albrecht, W. and Sack, R. (2000), Accounting Education: Charting the Course through a Perilous Future, Accounting Education Series, American Accounting Association, Sarasota, FL, pp. 1- 72.
American Accounting Association (AAA). Report of the Changing Environment Committee. (1998), The Future Viability of Accounting Education, Sarasota, FL, p. 111.
American Institute of Certified Public Accountants (1998), CPA Vision Project: Focus on the Horizon, Executive Summary and CPA Vision Project Focus Groups: Public Practice, Industry, and Government CPAs, New York, NY, pp. 13-31.
Arthur Andersen & Co., Arthur Young, Coopers & Lybrand, Deloitte Haskins & Sells, Ernst & Whinney, Peat Marwick Main & Co., Price Waterhouse, and Touche Ross (1989), Perspectives on Education: Capabilities for Success in the Accounting Profession (The White Paper), New York, NY, pp. 168-195.
Black, T. H. and Duhon, D. L. (2003), "Evaluating and Improving Student Achievement in Business Programs: The Effective Use of Standardized Assessment Tests," Journal of Education for Business. Vol. 79, No. 2, pp. 90-98.
Cheney, P., Hale, D. and Kasper, G. (1990), “Knowledge, Skills, and Abilities of Information Systems Professionals: Past, Present, and Future,” Information & Management. Vol. 19, No. 4, pp. 237-247.
Demong, R., Lindgren, J. and Perry, S. (1994), “Designing an Assessment Program for Accounting,” Issues in Accounting Education. (Spring) pp. 11-27.
Gallivan, M., Truex III, D. and Kvasny, L. (2004), "Changing Patterns in IT Skill Sets 1998-2003: A Content Analysis of Classified Advertising," Database for Advances in Information Systems. Vol. 35, No. 3, pp. 64-86.
Gainen, J. and Locatelli, P. (1995), Assessment for the New Curriculum: A Guide for Professional Accounting Programs, American Accounting Association and Accounting Education Change Commission, Sarasota, FL.
Institute of Management Accountants (IMA) (1999), Counting More, Counting Less: Transformation in the Management Accounting Profession, Executive Summary, Institute of Management Accountants, Montvale, NJ. pp. 3-18.
Koncz, A. and Collins, M. (2007), "News for Media Professionals," NACEWeb, Retrieved January 9, 2008, from http://www.naceweb.org/press/display.asp?year=2007&prid=270.
Landry, J. P., Longenecker, H. E., McKell, L., Pardue, J. H., Reynolds, J. H., and White, B. (2006), "Using the Model Curriculum and CCER Exit Assessment Tools for Course-level Assessment," Information Systems Education Journal. Vol. 4, No. 73, Retrieved January 4, 2008, from http://isedj.org/4/73
Lee, D., Trauth, E. and Farwell, D. (1995), “Critical Skills and Knowledge Requirements of IS Professionals: A Joint Academic/ Industry Investigation,” MIS Quarterly. Vol. 19, No. 3, pp. 313- 332.
McKell, L. J., Longenecker, H. E., Reynolds, J., Landry, J. P., and Pardue, H. (2006), "The Center for Computing Education Research (CCER): A Nexus for IS Institutional and Individual Assessment," Information Systems Education Journal. Vol. 4, No. 73, Retrieved January 4, 2008, from http://isedj.org/4/73
McGinnis, D. R. and Slauson, G. J. (2003), "Advancing Local Degree Programs Using the IS Model Curriculum," Information Systems Education Journal. Vol. 1, No. 37, Retrieved January 4, 2008, from http://isedj.org/1/37
Martell, K. (2007), “Assessing Student Learning: Are Business Schools Making the Grade?” Journal of Education for Business. Vol. 82, No. 4, pp. 189-195.
Misic, M. (1996), "The Skills Needed for Today's Systems Analysts," Journal of Systems Management. Vol. 47, No. 3, May-June 1996, pp. 34-40.
Paranto, S. and Shillington, L. (2006), "Is it Possible to Assess Information Systems Skills using a Multiple Choice Exam?" Information Systems Education Journal. Vol. 4, No. 24, Retrieved January 4, 2008, from http://isedj.org/4/24
Pringle, C. and Mitri, M. (2007), “Assessment Practices in AACSB- Accredited Business Schools,” Journal of Education for Business. Vol. 82, No. 4, pp. 202-211.
Reynolds, J. H., Longenecker, H. E., Landry, J. P., Pardue, J. H. and Applegate, B. (2004), "Information Systems National Assessment Update: The Results of a Beta Test of a New Information Systems Exit Exam Based on the IS 2002 Model Curriculum," Information Systems Education Journal. Vol. 2, No. 24, Retrieved January 4, 2008, from http://isedj.org/2/24
Robert Half International Inc. (2006), Next Generation Accountant: New Competencies, Converging Disciplines, and Expanding Roles, Menlo Park, CA, pp. 4-18.
Segars, A. and Hendrickson, A. (2000), “Value, Knowledge, and the Human Equation: Evolution of the Information Technology Function in Modern Organizations,” Journal of Labor Research. Vol. 21, No. 3, pp. 431-445.
Stemler, L. and Chamblin, C. (2006), "The Role of Assessment in Accreditation: A Case Study for an MIS Department," Information Systems Education Journal. Vol. 4, No. 39, Retrieved January 4, 2008, from http://isedj.org/4/39
Surendran, K., Ehie, I. C. and Somarajan, C. (2005), “Enhancing Student Learning across Disciplines: A Case Example Using a Systems Analysis and Design Course for MIS and ACS Majors,” Journal of Information Technology Education. Vol. 4, pp. 257-274.
Thacker, R. A. and Yost, C. A. (2002), “Training students to become effective workplace team leaders,” Team Performance Management. Vol. 8, No. 3/4, pp. 89-94.
Todd, P., McKeen, J. and Gallupe, R. (1995), "The Evolution of IS Job Skills: A Content Analysis of IS Job Ads," MIS Quarterly. Vol. 19, No. 1, pp. 1-37.
Todorova, N. and Mills, A. (2007), "The Development of Assessment Portfolios for IS Majors," Information Systems Education Journal. Vol. 5, No. 25, Retrieved January 4, 2008, from http://isedj.org/5/25
Wade, M. and Parent, M. (2001/2002), “Relationships Between Job Skills and Performance: A Study of Webmasters,” Journal of Management Information Systems. Vol. 18, No. 3, pp. 71-96.
White, B. and McCarthy, R. (2007), "The Development of a Comprehensive Assessment Plan: One Campus' Experience," Information Systems Education Journal. Vol. 5, No. 35, Retrieved January 4, 2008, from http://isedj.org/5/35
Wynekoop, J. and Walz, D. (2000), “Investigating Traits of Top Performing Software Developers,” Information Technology & People. Vol. 13, No. 3, pp. 186-202.
Department of Accounting and MIS
Southeast Missouri State University
One University Plaza MS 5815
Cape Girardeau, MO 63701 USA
Ken Surendran
Department of Computer Science
Southeast Missouri State University
One University Plaza MS 5950
Cape Girardeau, MO 63701 USA
Debbie Beard is an associate professor of Accounting in the Donald L. Harrison College of Business at Southeast Missouri State University in Cape Girardeau. She holds a PhD in Business Administration with a major in accounting from the University of Arkansas – Fayetteville. She is a Certified Management Accountant. Her current research interests include experiential learning, information security, accounting and MIS curricula restructuring, financial literacy, and ethical codes of conduct.
Dana Schwieger is an associate professor of Management Information Systems in the Donald L. Harrison College of Business at Southeast Missouri State University in Cape Girardeau. She holds a PhD in Management Information Systems from Southern Illinois University – Carbondale. She was selected for the Boeing Welliver Faculty Fellowship. Her current research interests include adaptive structuration theory, health management information systems, MIS curricula restructuring and e-business strategy.
Ken Surendran is a professor in the Department of Computer Science at Southeast Missouri State University. His research interests include software engineering, security management education, and object technology. He has a BE in electrical engineering from the University of Madras, India, an M.Tech in electrical engineering from the Indian Institute of Technology, and a PhD in applied analysis from the State University of New York at Stony Brook. He is a senior member of the IEEE and a member of the ACM.
Copyright EDSIG Summer 2008
(c) 2008 Journal of Information Systems Education. Provided by ProQuest LLC. All rights Reserved.