2/23/2011

Managing Google Calendars with Multiple gApps Domains

I've been a happy user of Google Apps standard edition since it was first launched. Tied to my personal domain, the suite of email, calendar, docs, etc. has been worth far more than its (free!) price would suggest. Above all, gCal has kept our hectic family life as close to coordinated as is possible. Each family member has an account. The family rule is that all events must go on your gCal, and gCal is configured so that everyone sees everyone else's calendar. Scheduling family events is much easier.

Google Calendar Sync has further simplified keeping calendars up to date. It automatically and seamlessly syncs events both ways between my work Outlook calendar, on Exchange, and my Google calendar. Ditto for Susan. With gCal Sync it doesn't matter whether I use the Outlook calendar or the Google calendar; changes flow in both directions. This capability was especially valuable when I had a Treo 700P and then a Samsung Saga smartphone, both of which connected via the Exchange server. Seamless, as long as gSync was running.

Last week, ONU transitioned -- finally! -- to Google Apps for Education. (In celebration I retired my WinMo phone for a Droid X ... but that's another story.) So, now I have two Google Apps accounts: a personal account and a work account.  Work-Life separation is good. Right?  Maybe.

I've discovered that there is a big difference in gCal calendar sharing within an Apps domain vs. across Apps domains. Within an Apps domain, one can share a calendar, let the other person see all event details, and even grant the ability to modify entries. Sharing calendars across domains apparently allows only sharing of free/busy status. That's helpful, but if I'm sharing my work calendar with family members, I'd like for them to know whether that blob on the calendar is a class session or office hours, for example.
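
For the programmatically inclined, the distinction maps onto the access roles in the Calendar API's ACL feed. Here's a rough sketch using the Google Calendar API's Python client -- the addresses are made up, `creds` stands in for an already-authorized OAuth credentials object, and the cross-domain limitation is enforced by the Apps domains' sharing settings, not by anything in this code:

    from googleapiclient.discovery import build

    # Assumes `creds` is an authorized credentials object (setup omitted).
    service = build('calendar', 'v3', credentials=creds)

    # Within my own Apps domain, a grant can carry full detail and edit rights:
    in_domain_rule = {
        'scope': {'type': 'user', 'value': 'susan@example-family.com'},  # placeholder
        'role': 'writer',  # sees all event details and can modify entries
    }
    service.acl().insert(calendarId='primary', body=in_domain_rule).execute()

    # Across Apps domains, what effectively comes through is free/busy only:
    cross_domain_rule = {
        'scope': {'type': 'user', 'value': 'me@example-university.edu'},  # placeholder
        'role': 'freeBusyReader',  # free/busy blobs only, no details
    }
    service.acl().insert(calendarId='primary', body=cross_domain_rule).execute()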

What to do?

My (non-optimal) workaround is to create an event on my work calendar. When creating the event I invite myself using my personal-domain email address. This gets the event onto both calendars in a way that can be seen by all family members on my private-domain gCal.
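
In API terms, the trick is nothing more than adding my personal address as an attendee when the event is created. A rough sketch along the same lines as above (same assumed `service` object; the summary, times, and address are all illustrative):

    # Create the event on the work calendar and invite my personal-domain
    # address, so a copy lands on the family-visible personal calendar too.
    event = {
        'summary': 'Class session',  # illustrative
        'start': {'dateTime': '2011-02-24T10:00:00-05:00'},
        'end':   {'dateTime': '2011-02-24T11:00:00-05:00'},
        'attendees': [{'email': 'me@example-family.com'}],  # placeholder
    }
    service.events().insert(calendarId='primary', body=event,
                            sendNotifications=True).execute()

Same effect as clicking "Add guests" in the gCal UI; it's the invitation copy, details intact, that the family sees.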

Ideal? Nope. But serviceable until Google modifies gCal to expand calendar-sharing abilities across Google Apps domains.

2/13/2011

Drifting Through Academically Adrift

Arum and Roksa's book Academically Adrift: Limited Learning on College Campuses has created quite a stir lately. The book reports a study of student learning in college. The data are from students who have completed their sophomore year at a small number of diverse institutions. The Collegiate Learning Assessment (CLA) instrument, a device designed to measure critical thinking skills, is their key measure of academic performance.

Arum and colleagues interpret their data, which track students through 50% of their undergraduate college experience, to suggest that today's college students are "academically adrift." This catchy phrase is intended to convey that today's college students lack purpose, lack an understanding of the academic foundation necessary for their aspired career paths (or lack an aspired career path altogether), and spend more of their waking hours on non-academic activities than they allocate to their studies.

"Academically adrift" makes great headlines. However, the connection between the reported findings and the “academically adrift” conclusion is tenuous. The authors provide no historical data to support that what they observe in today's college students is any different than what a similar study may have revealed if conducted twenty, thirty, or fifty years ago. They provide no evidence that today's college students are any more (or less) "academically adrift" than college students of yore. Further, the conclusion appears to emanate from the authors’ normative stereotype of undergraduate college students specifically, and the undergraduate college experience, generally. The reported data could be interpreted to suggest that the data don’t align with the authors’ stereotype of the undergraduate college experience.

The authors observe that many college students lack a clear understanding of their career path and of the knowledge, skills, and abilities (KSA) necessary for the career paths they may aspire to pursue; some are unclear on their career aspirations altogether. It is unclear to me how this lack of clarity affords evidence that students are academically adrift in college. This symptom could evidence failure of multiple systems: our high schools, parenting, and/or other institutions that socialize young Americans about their options ahead of their progression to college. Besides, isn't it common knowledge that one purpose of the traditional four-year undergraduate experience is to afford opportunities to identify one's destiny?

Courses that require reading and writing are key factors in improved CLA performance. The relevant measures are whether a student took at least one course that required more than 40 pages of reading per week and at least one course that required more than 20 pages of writing over the course of the semester. It is difficult for me to conceive of any college course that does not require at least 40 pages of reading per week and in which students generate at least 20 pages of written material. That said, it appears that such experiences are in fact rare for many of the freshman and sophomore college students in the CLA database.

The analysis of how CLA performance varies across fields of study is problematic (p. 104). Their data suggest that business students evidence significantly lower scores on the CLA relative to all other fields studied (science/math, humanities/social sciences, health, engineering/computer sciences). This finding holds after partialling out variance attributable to other factors (e.g., social background, academic preparation, prior CLA performance, institutional factors, reading/writing requirements encountered in college course work). This finding is problematic because most undergraduate business schools do not admit students until they achieve junior standing and survive a screening process. This implies that students self-identifying as business majors as freshmen and sophomores are not yet in the business program; they are taking courses in hopes of achieving the GPA and other requirements for enrolling in a business program. (These students might be more usefully classified as 'undeclared.') A consequence is that we would expect substantial variance in the ability of freshman and sophomore students who self-identify as business majors. The performance of business students on the CLA would be more appropriately measured if the analysis categorized as business majors only those students who successfully matriculated into a business program at the end of their sophomore year.

What explains variance in CLA performance? Two class activities consistently emerged as predictors. One, students who took a course that required more than 40 pages of reading per week tended to perform better. Two, students who took a course that required writing 20 or more pages during the semester also tended to perform better. Falling into both categories – i.e., taking a course that required reading more than 40 pages per week and a course that required writing more than 20 pages over the course of the semester – proved an even more powerful predictor of CLA performance. The authors suggest that courses with this sort of academically rigorous activity meaningfully further the critical thinking skills the CLA is designed to measure.

Unsurprisingly, academic preparation also emerged as a significant predictor of CLA performance; better-prepared students tended to exhibit greater increases in CLA scores in the first two years of their college experience. This finding speaks to the preparation of students for college and not to the value added by the college experience.

"This pattern suggests that higher education in general reproduces social inequality" (p. 40). This seems a specious conclusion.  Implicit in this statement is the assumption that differential ability on the input end of a process will somehow disappear or be eliminated by the educational process.  A fairer conclusion is that higher education, in general, does not eliminate differences in ability.

The final chapter, A Mandate for Reform, strikes me as oddly disconnected from the reported findings. The chapter reads more like the authors' dream for the educational process than a discussion of potential implications of their findings.

The data for their study were collected from freshman and sophomore students. Accordingly, the findings afford some insight into what happens during the first two years of college for a sample of undergraduate students attending a finite number of institutions. Consequently, the data set affords a platform for offering recommendations for enhancing the effectiveness of those first two years. Extrapolating data collected from freshmen and sophomores to the totality of the undergraduate collegiate experience is simply unsupported.

This is too bad. Their findings, to my eye, afford the foundation for some very pragmatic recommendations. For example, it would seem to flow naturally from their data that institutions should expand the number of courses that require reading more than 40 pages per week and/or require writing more than 20 pages over the course of the semester. Simple.

Ultimately, Academically Adrift affords interesting insights into the experience of some students, at some institutions, during their first two years of college. The authors' extrapolation of their findings into an indictment of the entire educational system -- as they choose to do -- takes their rhetoric into territory not supported by their data.

2/12/2011

Sand in the Vaseline: The Ed Schmidt Service Experience

For reasons that escape me currently, I took the '03 Passat to Ed Schmidt, the local VW dealer, yesterday for a scheduled oil change, and to have a couple items checked out.  Thinking it would be a short event, I decided to take advantage of the WiFi in the waiting area and work while the car was serviced.  How long can an oil change and a quick diagnostic take?  Over two hours, as it turns out.  Yep, for a scheduled appointment.

Oh, well, as the service dude explained, the "secondary hood release" dongle needed to be replaced. Interestingly, he explained that it was worn. That's more than hilarious for a variety of reasons. Hint: as far as I can tell, it was missing altogether. Guess who last did work on the car? But I digress.

This morning, I found a pool of oil under the Passat. Dark, dirty oil. Huh? Again? Yeah, this has happened before, and only after an Ed Schmidt oil change. One time, rather than remove the aerodynamics-improving belly pan that covers the oil plug, the crack Ed Schmidt mechanic emptied the oil into the belly pan. Naturally, I'm wondering: did they do that again?

Just after I discovered the oil puddle in the garage, a Customer Satisfaction Survey from Ed Schmidt landed in my email. So, I launched the survey. All I wanted to do was ask about the oil puddle. The genius who designed the VERY LONG survey configured it such that a respondent must answer every single question in order to submit it. All I wanted to do was submit a question. Nope, can't do.

Ah, ha! Instead I'll reply to the survey invitation email. An email sent to rrandall@edschmidt.com should get my question to a living, breathing Ed Schmidt employee. Right?

Nope. Email sent in reply to the customer satisfaction survey generates this response:

Delivery has failed to these recipients or distribution lists:
rrandall@edschmidt.com
The recipient's e-mail address was not found in the recipient's e-mail system. Microsoft Exchange will not try to redeliver this message for you. Please check the e-mail address and try resending this message, or provide the following diagnostic text to your system administrator.
Brilliant!

So, I pulled up Ed Schmidt's web site in search of a contact. A live chat dialog appeared. A Todd queried as to how he could assist me. (Immediately, I'm wondering if he has a sister or girlfriend named Margo.) OK, game on! I shared with Todd the fact of the pool of oil on the garage floor and the saga of the bounced customer satisfaction email. Todd promised to have someone contact me.

Amy, Ed Schmidt's Internet Manager, called me a short while later. She promised to have the service manager call me and asked that I forward to her the customer satisfaction survey invite email with the bouncy return address. I did.

Jeff, the SM, called a short while later, and "Sir'd" me a lot. Jeff says "sir" in a way that implies an effort to put one in his/her proper place (not to be confused with the respectful "sir!" as voiced by military personnel). "Sir, I can't know for certain without examining the car, but it could be what we call a 'messy oil change.' Sometimes the guys fumble the oil filter when removing it and oil gets spilled in nooks and crannies in the engine. You would need to bring the car in, sir, for us to examine it and make sure everything is OK."

So, this is a problem that happens so frequently that Jeff has a name for it?  Not a good sign. This is the fourth time a simple oil change has required a repeat trip to Ed Schmidt to have post-service service performed on the car. If I've experienced problems subsequent to oil changes at Ed Schmidt this often, it suggests major gaps in their quality control.  Rather than name the error, how about instituting processes aimed at eliminating the occurrence of the error?