Wednesday, September 10, 2008

Learning in the Digital Age

I enjoyed John Seely Brown's articulation of the distinction between information and knowledge. It was short and to the point. His further analysis of knowledge into explicit and tacit components was helpful in seeing how Web 2.0 tools are facilitating the growth and transfer of knowledge.

His discussion of university education seemed pertinent as I am generating ideas for my 894 project. As I looked into programs, I was definitely concerned about job placement after the program. I wanted a program that offered internships and project-based learning, so that I would graduate with tangibles to bring into a job or interview, not just a piece of paper stating that I had a degree in the field.

State universities have often been a hybrid of theory and practice. Traditionally, universities were theory-based, while community colleges offered more practical courses and supported vocational training for the community. I wonder how learning in the digital age will shift these neat distinctions.

Technology is definitely learned best when used. You cannot read a book about a program and expect to be competent in using it. It is best when you are asked to complete a task using the program, so that you engage with the interface and develop problem-solving skills as you run into relevant issues.

3 comments:

Brian said...

After listening to George Siemens' podcast I thought, "It's not just about understanding technology, it's how you use it." I agree that in the digital age learning is more of a connectivist process. Students might still be learning through older methods, but what matters is how students, or people in general, are applying their learning with others. It's how they are using it to reach their goals. We are learning more about how to find the context before talking about a solution.

The way people learn is greatly affecting university curricula now and as we move forward. It will be an evolving process, as Siemens states. The one statement he makes that has stuck in my mind is "the ability to learn what we need for tomorrow is more important than what we know today." We'll see how this changes our university curricula and how it shapes learning in the digital age.

Anonymous said...

I agree that technology is learned best when used in practice, and I had never thought about how technology might blur the lines between the university and community college learning systems. If university curricula continue to be based mainly in theory, they may be ill-suited to teach students the skills they'll need in a digital world.

Like you, I also found the project-based foundation of the ITEC program appealing. Creating a tangible product really helps me apply the principles I'm picking up through lectures and written resources.

CJ said...

With respect to university curricula in the digital age, I think universities will really have to recognize that informal learning is a significant aspect of the learning experience. Supporting communities of practice, personal networks, and work-related tasks will have to be incorporated into the curriculum. Universities have typically diverged from these methods, which are more commonly found in community college curricula; however, because technology is such an integral part of today's learning environment, it makes sense that universities would have to integrate practical course objectives. Vocational education is, by definition, directed toward a profession or trade, which community colleges have found is best taught with practical, hands-on training. I see technology as a skill or trade that must be learned in conjunction with theory-based knowledge in the digital age. I think the curricula of community colleges and universities will become strikingly similar.