Wednesday, December 8, 2010

Unit 13 - in retrospect

I learned a great deal this semester about configuring open source software to create a functional digital repository. I also learned more about managing repositories (on both the people and technology sides) and about why it is so important to digitize our cultural collections. I appreciated how many server configurations we were assigned, and I feel pretty confident now maneuvering through the command line. I'm still working out how everything fits together, specifically the overall management components necessary to effectively curate digital collections.

One of my favorite management readings of the course was the British Library's vision statement. Yes, I know it is a study in careful marketing, but this document made me realize again why it is that I want to be involved in digitizing cultural collections: the immediacy and accessibility of digitized historical content may have a profound impact on content creators, who gain the ability to "revisit" history through such artifacts.

Another part of the course that I really enjoyed was exploring the larger federated repositories of Trove & Europeana. Both of these search engines mine the "deep web," and I was able to find new information from collections that I've visited before. I also enjoyed the faceted searching and the variety of ways to search for information that both of these sites offered. I especially appreciated the exhibits available through Europeana, particularly the Art Nouveau exhibit.

Best of all was configuring Dspace & Omeka. Eprints was probably the most painful for me to configure (lots of troubleshooting), though I was quite pleased with the end product. I encountered few issues configuring Drupal, but its interface was my least favorite; the highly customized taxonomy creation (with its greater chance of typos and less standardized terms) left me less confident about an eventual migration.

Sunday, November 14, 2010

Unit 12 (or unit 11, part 2)

I think the time involved in the standard install has at this point shrunk to an extra two hours or less in the overall unit exercises. Considering that a repository will (theoretically) be around for a while, I can't see how this approach is that much more time consuming than something 'off the shelf,' even in the short run. In the long run, I'm guessing it's better to build your repository from scratch as we've been doing, because if you know how something is put together, you're far more likely to know where to look when it breaks. More than a few times when I was unable to use Fugu for transferring files, I had to fall back on the command line. This was a great learning experience, and I understood the server better after having gone into the files. Once, an XML file I had uploaded wasn't visible in Fugu, but I could read the entire file from the terminal. I also got to practice the commands for moving into a particular directory, and this was really helpful.
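
In case it helps anyone else stuck on Fugu, here is a minimal sketch of the command-line fallback I mean – the IP address, user, and file names are hypothetical stand-ins for your own VM's:

    # copy a file to the VM over SSH instead of Fugu
    scp collection.xml authuser@192.168.1.99:/home/authuser/

    # then, on the VM, confirm the file arrived and read it
    cd /home/authuser
    ls -l collection.xml
    cat collection.xml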

I have far more confidence in my abilities than I once did, but I know that I have a lot more to learn, and that some of this will be learned on the job. I'm already concerned that next semester I won't have many opportunities to go into the command line, and I guess I'm already somewhat wistful for this dark screen:) I do think it would have been better if 675 had come right before my capstone project, because I know that in some way I would like to be involved in repository configuration. One thing I'm curious about: do you have the same ability to create snapshots outside of a VM? I'm guessing the equivalent would be the typical server backups that most institutions have. Ultimately, being able to take snapshots gave me much more confidence through both 672 & 675.

Tuesday, November 9, 2010

Unit 11 - Omeka


I have enjoyed the end product of all the repositories so far; however, I would have to say that the user interface for Drupal is my least favorite. Dspace was at the top of my list until this week, following the installation of Omeka. Omeka is by far the strongest in terms of the graphical interface, as well as in the flexibility of the configuration. Its presentation piece seems to suit my collection very well, especially since the collection consists of a variety of file types, including MOV, MP3, PDF & JPEG. I also appreciated the ease with which I could add metadata fields, and, on the presentation side, how easy it was to access and play my files, especially my film & music files.

I still appreciate Dspace and its communities and collections, but I believe that for my final project I will be utilizing Omeka. I appreciate JHOVE for all the technical metadata it revealed about my objects, and I liked the interface and searching capabilities of Eprints. I also loved harvesting metadata by pointing the OAI harvester at a repository's base URL.
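
For my own notes, the harvesting step boils down to pointing a client at the repository's base URL with a standard OAI-PMH verb. A hypothetical example – the host and path are placeholders for your repository's actual OAI endpoint:

    # ask the repository to describe itself
    curl "http://192.168.1.99:8080/oai/request?verb=Identify"

    # list all records as unqualified Dublin Core
    curl "http://192.168.1.99:8080/oai/request?verb=ListRecords&metadataPrefix=oai_dc"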

To date this has been my most challenging semester, based in part on the advanced command line configuration we've had to do (and the subsequent troubleshooting that has had me pulling my hair out at times ☺), as well as the more challenging, analytical and critical management discussions that we've had this semester in evaluating larger federated repositories. I'm starting to see the light at the end of the tunnel, so to speak, and I'm really starting to think seriously about my capstone project.

Tuesday, November 2, 2010

Unit 10 - an effective federated repository

I looked at the OAIster database sponsored by the University of Michigan digital library. I couldn't get a sense of what the collections were; all the landing page states is that it provides access to millions of digital objects from thousands of contributors. It wasn't like the Europeana federated repository, where the first thing you see on the landing page is a featured Art Nouveau exhibit. That was a great way to be introduced to the site. I'm not too excited to search a federated repository that features nothing. Of course, if I already know something about a repository then I guess I wouldn't need a polished-looking landing page, but really, how many people will the site lose by not highlighting one or more of its partner collections?

I then visited the Sheet Music Consortium. The funny thing is that I entered a keyword search for swan songs, and the first two records that came up were memorial songs for Bert Williams, an African American vaudeville performer who was active during the early 20th century. There were no results for "swan songs" in quotation marks.

I also looked at NORA (the Norwegian Open Research Archives), and thought it was interesting how you could narrow the topic through what looks like a Mac file structure. All of the results were listed in Norwegian. It made for an intriguing interface, especially the way the topics were divided according to that file structure.

The best-designed interface was by far the Sheet Music Consortium's, but it was the only site that seemed to have made an attempt at creating a visually appealing landing page. This is where a federated repository such as Europeana offers a far more appealing interface, regardless of Erway's measured criticisms.

Tuesday, October 26, 2010

Unit 9: cataloging my Sutro District collection

I have two journal articles in my collection, and I've found it helpful to look up their abstracts in a noted database such as EBSCO and then review the subject terms listed in the record. Additionally, I have two films that are part of the Library of Congress' American Memory Project. Subject terms are quite explicit in these records and have been helpful in focusing the subject listings of my collection.

I believe that I've been the most consistent with Dspace regarding my collection, as with Dspace I was able to utilize both AAT & LCSH – additionally, my LCSH listings were fairly granular in Dspace. With Eprints, I could not achieve the granularity in my subject listings that I could with Dspace. For instance, with Eprints I could only choose Psychology for some of my objects when what I really wanted to choose was ghosts, a subheading under Psychology. Additionally, I like using AAT's definition of ruins over the LCSH definition, the latter being specific to ruins of antiquity. The AAT definition of ruins is broader, encompassing contemporary ruins as well. Obviously the Sutro Baths ruins cannot be described as ruins from antiquity.

Overall I’m enjoying cataloging my collection, but I can see how good, detailed cataloging can be very time-consuming and thus very expensive in a real working environment. But I can also understand how you wouldn't want to cut any corners when it comes to cataloging, as the metadata is key to access; if no one can find your collection, it may as well not exist.

Monday, October 18, 2010

Unit 8 & the perils of Eprints

So far I prefer Dspace over Drupal & Eprints. It has been helpful to have to do the standard install now for the 5th time. I think I’m just starting to get the hang of it:) I still had some problems this time with connectivity, but I think that’s because I configured the static IP before doing the sudo aptitude updates.
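
For anyone repeating the install, the ordering that seems to work is to finish the updates while the VM still has DHCP networking, and only then pin the static address. A rough sketch, with hypothetical address values:

    # 1. update first, while DHCP still provides a working connection
    sudo aptitude update
    sudo aptitude safe-upgrade

    # 2. then assign the static IP (the addresses below are examples)
    sudo nano /etc/network/interfaces
    #     iface eth0 inet static
    #         address 192.168.1.99
    #         netmask 255.255.255.0
    #         gateway 192.168.1.1
    sudo /etc/init.d/networking restart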

I should have read Emily’s post earlier about mixing up the primary user password with the necessary mysql password, as I had to repeat this step 3 times before realizing that it needed my unique mysql password.

Another issue came up when I tried to configure eprints with two repositories. Here I changed my hostname to eprints324, the name of my new vm. I really appreciate all the help that the tech activity has offered, and when I find the solution through Bruce's or another classmate's assistance, I try to add as much detail as possible to the posts in case others run into the same issue.

I had a heck of a time finding /private/etc/hosts on my actual host machine. This was a very interesting exercise in patience, and I must admit that I learned quite a bit about changing permissions for these files on my host machine. I had little time to configure my Eprints site by the time I actually got it up and running, but at last both repositories are ready to have collection items added. I'm still very much partial to Dspace; I appreciate its user-end appearance over that of either Drupal or Eprints.
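
For anyone hunting for the same file: on a Mac, /etc/hosts is actually a link to /private/etc/hosts, and the file is owned by root, which is why the permissions dance is needed. A sketch of the fix, with a hypothetical VM address:

    # on the Mac host machine, edit the hosts file as root
    sudo nano /private/etc/hosts

    # add a line mapping the VM's IP to the new repository hostname:
    #     192.168.1.99    eprints324

    # then confirm the name resolves
    ping -c 3 eprints324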

Tuesday, October 12, 2010

Unit 7: Capstone Project Musings

Since our blog discussion this week is a subject of our own choosing, I thought I would discuss my dream capstone project. While I wouldn't mind working on a project at my home institution of the UCM Library, my dream collection would be the Adolph Sutro Collection at UC Berkeley. To my knowledge, this collection has not yet been digitized. I know a few of the librarians at Berkeley, and I'm thinking that I need to start putting my feelers out to see if there is any chance that I would be able to participate in such a project.

The finding aid on OAC shows that this collection is not currently available in digital format. I know this is more than likely a pipe-dream project, and I'm not even sure if there are any plans to digitize the collection in the near future.

Another possible project might involve a digital-humanities collaboration with the writing program here at UC Merced. I am involved in a group of mostly writing professors here on campus called the “Critical Theory Reading Group.” This group meets every few weeks to discuss readings by philosophers and cultural critics such as Jacques Derrida, Walter Benjamin and Siegfried Kracauer. I think that it would be really interesting to prepare an archive of digital objects related to seminars and projects sponsored by the group.

Tuesday, October 5, 2010

Dspace (Unit 6)

I had problems finding the webapps line during the dspace installation; the first time I used the control-w command (so happy to know about this search command!), it could not locate the line. So I went back a few steps in the install and edited the additions to /etc/default/tomcat6, this time leaving a space between the last line and the beginning of the additions (tomcat user & security). I saved, and when I got back to tomcat6/server.xml I was able to locate the line for editing.
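
For the record, the additions and the search looked roughly like this – a sketch from memory, and the user/security values shown are examples rather than the install doc's exact text:

    # appended to the end of /etc/default/tomcat6, after a blank line
    sudo nano /etc/default/tomcat6
    #     TOMCAT6_USER=dspace
    #     TOMCAT6_SECURITY=no

    # then search for the webapps line in server.xml with control-w
    sudo nano /etc/tomcat6/server.xml
    #     press control-w, type webapps, press enter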

I think that some of my confusion stemmed from not knowing where the standard install ended and the Dspace configuration began. I also experienced some issues with the static IP I initially assigned. I'm not sure if this was behind my connectivity issue, but I was able to ping successfully after switching back to the IP I had used for the last VM.

I really like the Dspace interface overall, especially the organizational hierarchy of creating a community, then a collection, and then adding your items. Collaboration is also inherent in the organizational structure, in that communities can share collections. I'm looking forward to adding all of my metadata fields as established in my application profile. I think this will make my collection items that much more accessible and the item records much more robust.

Lastly, I liked how easy it was to add a logo & images to my Dspace site. I had to reformat my logo a few times, but the process was very straightforward and quick. Perhaps it's just me, but I felt that Dspace was far more customizable, and that the administrative functions were clearer - there was one point during the Drupal customization when I felt the page looked very crowded and confusing to navigate. I did not have this experience with Dspace, but perhaps that's because at this point my metadata is pretty sparse. We will see...

Monday, September 27, 2010

Installing Modules (unit 5)

I decided to add the Page_title module to my Drupal site. According to Drupal.org, this module allows granular control over the page title display. After downloading the module to my desktop, I realized that I could instead add it from the command line by referring to the download link in the release notes and using the command line instructions in the tech activity. After downloading via the vm, and going back into Administer > site building > modules on my drupal site, I saw that the module was disabled because it required the Token module. So I found that module's release notes on drupal.org and from the vm executed the same commands to download it. I was really excited when the page_title module was enabled after downloading the token module. I would imagine, however, that the more modules you download, the more complicated the back end of the drupal site becomes.
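
The command-line route amounts to fetching each tarball from the link in its drupal.org release notes and unpacking it where Drupal looks for contributed modules. A rough sketch – the Drupal root path and the version numbers in the URLs are hypothetical, so copy the real link from the release notes:

    # contributed modules belong under sites/all/modules in the Drupal root
    cd /var/www/drupal/sites/all/modules
    sudo wget http://ftp.drupal.org/files/projects/page_title-6.x-2.5.tar.gz
    sudo tar -xzf page_title-6.x-2.5.tar.gz

    # page_title depends on token, so repeat the same steps for it
    sudo wget http://ftp.drupal.org/files/projects/token-6.x-1.15.tar.gz
    sudo tar -xzf token-6.x-1.15.tar.gz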

The only part of this week's exercises where I got stuck was when trying to configure the faceted search in the VM, but I was able to figure things out after realizing that I had a misplaced asterisk. I also got stuck at the command line when trying to download the Date module, however in one of the activity posts Bruce supplied alternate text that worked instead.

Monday, September 20, 2010

Unit 4: more fun w/ Drupal

I think that Drupal is going to work very well for my collection of objects focused on the Sutro District. As professor Fulton mentioned in one of the postings this week, Drupal is very forgiving. Since I want to develop a taxonomy using more than one resource (LOC, AAT, etc.), Drupal seems to be the way to go.

My collection is in some ways an homage to the Sutro district, and several of the items are art objects inspired by the district. Thus a varied taxonomy feels like the correct choice, though we shall see just how complicated this may become.

My drupal address is http://192.168.1.99/drupal/. I had some difficulty getting to this page through either of my browsers today, but then I realized it was probably because I was logged in to the VNC for UC Merced. I thought it curious that I couldn't log in with user2 (?), though the authuser name & pw worked. However, I didn't have administrative privileges (arrh!). So I went back through the drupal install doc and realized that I needed to log in as admin – voila! This turned out to be the very least of my problems as I worked my way through modifying the file upload parameters, and eventually created my vocabularies (that part was fun!).

Sunday, September 12, 2010

Unit 3: class pace

I believe that the pace of the technology assignments is perfect. I feel as though I’m able to closely follow the instructions without too many setbacks. I also feel that the technical rigor of this summer’s course really helped hone some of my troubleshooting skills. I’m trying to keep up with reading the discussions as the course progresses, whereas last semester I had to take a few days to play catch-up and read all of the past discussions. This semester feels much more on track so far.

I’m also enjoying the management readings, podcast and discussions. I didn’t notice a quiz tab yet so I’m guessing that there will be no quizzes this time around. I think that I will miss that component of the course, as the quizzes were really good exercises in learning the more technical details of the class.

Another interesting observation is the cohesiveness of the students in the discussions. Most of us have been through at least two semesters together, and this is very different from my college experience, even as a graduate student (most of my graduate coursework was completed overseas at Oxford Brookes).

Overall I'm really enjoying this class so far, and particularly the emphasis on taxonomies. I want to learn as much as I can about this subject, and find it fascinating, especially when compared and contrasted with folksonomies.

Monday, September 6, 2010

Unit 2 & the CMS

The article that I reviewed is Regina Beach and Miqueas Dial’s “Building a collection development CMS on a shoestring.” The key players in developing the CMS for Texas A&M University – Kingsville (a small campus of 5,000) started out by considering all students as potential distance learners, and from there developed a collection development CMS that could be used remotely, whereas the previous protocol had been to require that faculty and students submit paper forms (yikes!). I think it’s interesting that a faculty member (also one of the co-authors) was invited to help develop the CMS. This seems like a great idea in developing library services – to get the faculty’s input especially where access is concerned.

I thought it very interesting that the authors mentioned that the long-term professional and paraprofessional staff who had worked together for several years were in fact crucial to the smooth implementation of technological change. It has not always been my experience that long-term staff are open to change, so I found this refreshing; it also said much for the authors' positive and empathetic approach to this project. The overall focus of the article is moving from a book request system using paper forms to online forms. The article goes on to address the ease of access of the new system, and how much of the record data input was automated.

Having myself experienced the transition from a library website based on Dreamweaver to a CMS (Joomla), I can honestly state that the CMS has allowed for greater ease and freedom in making changes to the website, ultimately benefiting the users in that information is disseminated as quickly as possible via the website. I'm not sure that the same ease, flexibility and (most importantly) speed in making information readily available would be possible if we were still using Dreamweaver.

Tuesday, August 31, 2010

675 - Unit 1

The collection of objects that I would like to discuss, and which I think may be of interest to others with a shared fascination with history, consists of objects related to the Sutro area in San Francisco. My collection will consist of some photographs and possibly a few drawings and films. This site is located in an area known as Lands End on the western coastal side of San Francisco. The land is now federally protected; it once held a large glass bathhouse that was built in 1894. The bathhouse burned down in 1966, but luckily, instead of the site being developed as originally planned, the federal government became the custodian of the land, so the ruins of the baths remain. I have long held a fascination for this site and have taken several photographs over the years. While none of the glass remains, the foundation and pools are still there, and if one is brave enough you can walk around the periphery of the large dark green pools.

People interested in ruins or San Francisco history, naturalists, etc., may find these objects of interest (hopefully!). My master's thesis was about the ruins of a 1923 movie set in the Guadalupe dunes on the central coast of California, so I've continued my fascination with these unique environments by trying to capture images of the Sutro Baths. I think some of the terms that could be used might include Adolph Sutro, Lands End, Cliff House (another institution created by Adolph Sutro, and still in business after more than 100 years), San Francisco & ruins. I might also include some postcards of the original baths that I have collected, as I think this will make for a more varied taxonomy.

Tuesday, August 10, 2010

Unit 12 - the end...

The "plan and do" material from the PMBOK guide reiterated what I learned in 673 on project management. Timelines are essential, as are the lines of communication. This was emphasized by the first reading, Cervone's piece on project risk management. I think this is one of the best reasons to have a CMS for managing projects within the organization: within a CMS you can track all the projects with which you are involved. The leader of an organization should be able to go into a CMS and view the various stages of projects within programs throughout the organization. Our University Librarian does this with our CMS – viewed a certain way, it offers something of a snapshot of programs and projects going on throughout the organization.

Cervone contends that when making a decision two forces are at play - the courage to make a decision along with a sense of caution about the consequences. I couldn't agree more with this summation. You have to be brave as a decision maker, but you also have to be frugal and conservative to some extent; you have to consider how that decision will affect others and the organizational workflows, but at the same time if you want to nurture a creative and dynamic organization then you need to be able to make decisions that will help the organization grow and flourish.

I also appreciate Cervone's point that if you wait too long to complete a project, it may turn into a totally different project. I think this idea ties in nicely with Unit 8 and how we explored the difference between long-term strategic planning and the shorter timelines inherent in technology plans. With our rapidly changing technological landscape it seems wise at every juncture to ask "why are we doing this?" Modifying a plan as it progresses is an important part of change management, as is ensuring that the project is executed in a timely and organized manner.

Sunday, August 1, 2010

Unit 11 & looking back

I honestly wish I hadn't struggled so much with FTP during this past unit so that I could concentrate more on the structure of PHP. I did do the standard lamp installation quite a while ago in unit 7, but I still wasn't seeing the /var/www directory in fugu. Thus I typed everything out in the command line using sudo nano (this was actually quite fun, as the tags light up in the command line – they do in text edit/dreamweaver as well, but it's not quite the same; in the command line you're seeing it against a dark background, and it was the only light I could see, so to speak, in my very gloomy confusion). But this is how we learn, right?
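
Since fugu wouldn't show /var/www, everything went in via sudo. The quickest sanity check I know of is a one-line PHP file written straight from the terminal – a sketch, with a hypothetical file name:

    # write a minimal PHP file into the web root as root
    sudo tee /var/www/info.php <<'EOF'
    <?php phpinfo(); ?>
    EOF

    # then browse to http://<vm-ip>/info.php to confirm PHP is being parsed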

For the most part I feel pretty good about what I've learned, but I know it's just the tip of the iceberg. I don't feel as though I can comprehensively articulate everything I've learned because it's all still swirling around me like a sea of numbers & words. I think there are many people in the class who are able to pick up this information much more quickly, so I've felt somewhat less skilled than in my previous two digin courses. With my 50-hour work weeks I've found the course load a bit daunting, but that's the feeling I expect from a graduate program too – it's not supposed to be easy.

I do agree with prof. Fulton that we should consider picking up certain programming languages, even if just one or two, in order to become proficient. Personally I really liked learning XML and Mysql: XML because of its flexibility and omnipresence in the current work of digital curation, and Mysql because I particularly enjoy its regular expressions and Boolean operators. It also goes hand in hand with this week's PHP exercises.

I’m looking forward to immersing myself in my final project paper and wrapping up the course over the next 7 days (thank goodness for some furlough time this week!), and I must say that I will miss the rigor that this course has provided – I also appreciate that prof. Fulton has been so available to the class, emailing back seemingly 24/7 when students are struggling with the activities. Anyways, I think it’s time to end my blog as I just did a control + c on a word doc as I would on the VM! Time to go to sleep.

Monday, July 26, 2010

Unit 10 - Tables are Zen

It took a while for me to realize that right and left joins are more about the output of the data than about some 'floating' table that happens to be on the right instead of the left. I have to admit that most of what we've been learning has been challenging, but I've really enjoyed the hands-on exercises from this week's unit. I also like that mysql is a language comprised of statements and Boolean expressions, and that you can break up a statement by pressing enter. Sometimes I press enter prematurely, and with mysql it is nice to know that doing so doesn't activate a command but just breaks it up. The semi-colon is like the period at the end of a paragraph (it feels that way too, as the inner joins pack a lot of instructions into one statement).
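
A small example of what I mean about enter vs. the semi-colon: at the mysql prompt you can spread one join across several lines, and nothing runs until the closing ';'. The table and column names here are hypothetical:

    mysql> SELECT artists.name, songs.title
        -> FROM artists
        -> LEFT JOIN songs ON songs.artist_id = artists.artist_id
        -> ;

The same query with RIGHT JOIN simply changes which table's unmatched rows are kept in the output – which is the "output, not floating tables" point above.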

Conceptually I think that learning about tables & mysql is easier to understand than some of the information in earlier units because of the ability to easily visualize the concepts. With the unit on Networked environments, for example, I felt that I had to have a lot of the concepts spelled out for me in order to understand, whereas with tables I think that I’m so used to functioning in a work environment comprised of information in tables that these concepts were much easier to grasp.

Thinking of different queries to run during this week's query exercise was a challenge but also a lot of fun, especially when the joins differed from those in the assignments (and especially when the queries worked!). I think I'm coming closer to understanding the difference between a primary key and a plain integer column, or in the language of mysql, an int unsigned not null auto_increment vs. an int not null.
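
Side by side, the two column definitions look like this in a hypothetical table – the auto_increment column is the one declared as the primary key:

    mysql> CREATE TABLE songs (
        ->   song_id INT UNSIGNED NOT NULL AUTO_INCREMENT,
        ->   year INT NOT NULL,
        ->   PRIMARY KEY (song_id)
        -> );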

Tuesday, July 20, 2010

Unit 9 and I feel fine:)

Some of the SQL tutorials became a little difficult for me once we moved on to the later sections, but everything before that point was relatively straightforward, especially after having done the tutorial in the content section. I really enjoyed listening to Mostafa, but at one point when he moved to integers, I admit that I started to feel lost.

I'm still struggling a bit with relationships between entities and normalization, although the more examples I look at and think about, the more "ah ha!" moments I have. I'm really looking forward to using Mysql, and to exporting a table as a text file. I'm also looking forward to working on our final project for this class. I think that one of the most humbling challenges has been the quizzes, as I usually don't fare as well as I would like – although I swear I will go through each and every one again before the final exam, trying to obtain the highest score.
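
On the exporting front, my understanding is that it can be as simple as redirecting a query's output from the shell – mysql prints tab-delimited text when its output goes to a file. The database and table names here are hypothetical:

    # run a query non-interactively and save the result as a text file
    mysql -u root -p -e "SELECT * FROM songs" music_db > songs.txt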

Overall this unit has been very informative; having worked with several databases throughout the years, I find it very interesting to think about conceptualizing a database from scratch: the types of queries you will need to run, the separate entities and attributes, and the primary key that serves as the unique identifier.

Sunday, July 11, 2010

Technology planning

One of the readings from this week's unit that I could most identify with was Stephens' Technoplans vs. technolust, as I see quite a few people around me enamored with technolust. I don't think this is a bad thing so long as it helps the library users, the users being the overriding concern of any technology plan. I also liked Stephens' point that technology plans are often implemented without full regard for how they affect the front lines. As a middle manager this is part of my job: to be the buffer between the technology and other implementations from the top and the front-line staff. Procedures and workflows help considerably, but it's also important to make sure that you're not overloading your staff.


I really enjoyed Schuyler's article "Life is what happens to you when you're making other plans." I see this all the time with our library technology group. They are constantly dealing with small but time-consuming issues that are not written into the technology plans, such as a cyberattack on self-check hard drives, viruses in the digital signage software, new printers needing to be configured, etc. I've also noticed how our primary technology services coordinator quickly moved up the chain of command in the last 3 years, and deservedly so given the responsibilities of his job and the greater part that technology plays not only in the work of staff but in the day-to-day needs of our users.

Monday, July 5, 2010

Introduction to XML

I worked through the entire "basic" section of the W3Schools web tutorials and thought the tutorials and examples were very clear. I would still like to go through the UACBT tutorials, as I really enjoy these and I like the narrator's clear and informal presentation; however, working 50+ hours a week, I'm having a difficult time making it through every single reading. I liked the straightforward examples directly under each point made throughout the tutorials, and thought the tutorial made the differences between elements and attributes very clear. I love the fact that XML tags, which are not predefined, allow for a great deal of flexibility – although with this flexibility comes the issue of metadata standardization.

My XML document included some photographs from my trip to Washington D.C. last week. I linked my photographs according to Professor Fulton's instructions in one of this week's activity responses. I didn't see any errors when opening my XML document with a browser (after some troubleshooting the first time around), so hopefully all went well. I added CSS at the top, but I didn't see any difference in the form of the document, so hopefully this is something we will go into in greater detail later on.
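
For anyone trying the same thing, the "CSS at the top" is an xml-stylesheet processing instruction, and (as far as I can tell) the browser only changes the display if the referenced .css file actually exists next to the XML. A stripped-down sketch with hypothetical file names and content:

    cat > photos.xml <<'EOF'
    <?xml version="1.0" encoding="UTF-8"?>
    <?xml-stylesheet type="text/css" href="photos.css"?>
    <photos>
      <photo>
        <title>Lincoln Memorial at dusk</title>
        <date>2010-06-28</date>
      </photo>
    </photos>
    EOF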

I don't have an MLIS, and my primary jobs at the library are in library services and exhibits, so I'm really trying to absorb the sections on metadata, especially MARC, Dublin Core, MODS & METS, and I'm looking forward to learning more about how XML creates interoperability and flexibility across different metadata schemes.

Monday, June 28, 2010

Unit 6 from D.C.

This posting is short and sweet as I'm out of town and haven't been able to complete the router assignments due to being away from my home router. However, I've been doing my homework in between sessions here at ALA.

The html & css tutorials were very helpful, and although I've worked minimally with actually writing html, I'm used to looking at the screens and the language via Dreamweaver and our library's CMS, which is powered by Joomla. CSS, however, had always been a mystery to me, so I'm really excited to have learned how to set a background color or font for every webpage that I may create.
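
What won me over is how little CSS this takes; a sketch of the kind of rule I mean, with example selectors and values:

    cat > site.css <<'EOF'
    /* one rule restyles every page that links this stylesheet */
    body {
      background-color: #f4efe1;
      font-family: Georgia, serif;
    }
    EOF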

I'm looking forward to learning more about scripts, as scripts seem to be key to interoperability between systems, although as of yet I'm not sure to what extent this is true; I do know that scripts control the different pieces of software being used.

Tuesday, June 22, 2010

Unit 5 & the Network Train

There seems to be so much reading required for this class, and everything that we read is crucial to understanding the concepts – sometimes the detail is so technical that I have to reread a section 2 or more times before it sinks in. It feels as though, partly by osmosis and repetition, I'm coming to understand how everything works together. This week's unit on networking has been one of the most interesting thus far, although honestly binary numbers are still throwing me for a local loop (joke). A long train ride up to Oakland allowed me to plow through the Nemeth reading, although again, it's when I get to the binary numbers and IP addresses that I feel a bit lost.
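
Writing one octet out by hand is what finally helped: in a hypothetical address like 192.168.1.99, the first octet is 192 = 128 + 64, i.e. 11000000 in binary. The shell will even check the arithmetic:

    # convert a decimal octet to binary with bc
    echo 'obase=2; 192' | bc    # prints 11000000
    echo 'obase=2; 99' | bc     # prints 1100011 (i.e. 01100011 as a full octet)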

I guess as far as learning types go, I fall into the visual learner category, although I think there must be a tinge of verbal learning style in the mix. The videos in this week's readings were great, especially the interview with Metcalfe discussing the creation of Ethernet and the video on sending data packets across the LAN and internet – this latter selection kind of reminded me of Space Mountain.

Sunday, June 13, 2010

Adding groups & users: grover, ernie & bert

I found the installation of webmin pretty straightforward. At first, when I found the ip address using ifconfig, I didn't realize that :10000 had to be added onto the end, but once I typed it in correctly all went well.
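
To spell out the step that tripped me up – Webmin listens on port 10000, so the address from ifconfig needs :10000 appended (the IP below is hypothetical):

    ifconfig    # note the inet addr, e.g. 192.168.1.99
    # then browse to the Webmin login page at:
    #     https://192.168.1.99:10000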

We were supposed to create 3 users and 3 groups (user name in each instance corresponded to a group of the same name). I chose names that were easy to remember, preferring to go with a sesame street theme so that now I have 3 new users & groups: grover, ernie & bert.

I was a little alarmed when I first saw the message "grover is not in the sudoers file. This incident will be reported," but as I looked more closely at the instructions I realized this is the message that I should see, since I had not added the admin group in the useradd command line. Although I'm still wondering to whom (or what) grover's incident will be reported?
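
Roughly, the sequence behind grover's scolding – a sketch, assuming (as on our Ubuntu release) that membership in the admin group is what grants sudo rights:

    # create the group and a matching user
    sudo groupadd grover
    sudo useradd -m -g grover grover

    # running sudo as grover now produces the warning:
    #     grover is not in the sudoers file. This incident will be reported.

    # adding grover to the admin group is what grants sudo
    sudo usermod -aG admin grover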

Adding a user & group through the remote desktop did not seem to go as well as the Webmin or VM processes: testing files with the sudo nano testfile command in the terminal utility took me into GNU nano, and from there none of my commands seemed to work. Eventually I just had to restart in order to escape. Overall I found the remote desktop a little cagey and slow, although the GUI utility for adding groups & users was easy to navigate. Ultimately I think that I'm slowly coming around to appreciating the CLI for all its transparency, and the more I type in the black screen (on the VM), the easier it seems to read through the lines of text...

Sunday, June 6, 2010

More command line fun

The vi tutorial took much longer than described in the assignments document. I'm not complaining though – thus far I feel this tutorial has been the most helpful of the hands-on exercises, and I think this is in part because of the background I've been gaining from the readings, from writing down and executing commands, and from looking up definitions.

I like going back and forth between insert and command using esc and i. It’s neat to see that the keyboard can have so many uses. I also like the :help command. This is so much better than accessing help in a different window, which has been my usual experience in using GUIs. Here help appears at the top of your window.

The most extensive configuration that I had done before this class was setting up wireless configuration on library users' laptops and some commands through telnet for our library's ILS, Millennium. I'm used to the computer or the software "telling" me what is necessary (auto updates, automatic configurations, etc). But with the CLI there is an interesting paradox: while we're configuring everything by hand, there is at the same time the ability to make more extensive changes with one command than would be possible with the GUI. Not that I'm ready for that yet, but with repetition & memorization I'm hopeful that all the exercises and readings will come together to form a cohesive whole.
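
One example of that reach, assuming a directory tree you actually own (the path and names are hypothetical): a single recursive command changes every file that a GUI would make you click through one by one.

    # change the owner and group of an entire tree in one line
    sudo chown -R authuser:ernie /home/authuser/collection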

Monday, May 31, 2010

Playing in the sandbox cont...

I had better luck going through the tutorials with two screens, one on my laptop and then accessing the sandbox through my mini. I really should print these tutorials out! But I’m writing everything out in my little definition journal – so archaic I realize, but I still like writing lest I forget. I was successful at utilizing the mkdir command to create a directory titled Mary – now I should put something in it☺. I also had the opportunity to look through an extensive binary file – whew. TG for the q command.

I'm enjoying the semantics of this new language; symbolic links sound interesting – files that point to other files. I also had the opportunity to use the * wildcard to find all text files in a directory. I wish I could have been more successful in the later tutorials, particularly pipes (it took me a while to find the vertical line character, with the help of fellow Digin student Emily), but I will keep trying…
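
For my notes, the wildcard and pipe steps that finally worked looked something like this (file names hypothetical):

    # every text file in the current directory
    ls *.txt

    # the vertical bar pipes one command's output into the next
    ls -l /etc | less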

Playing in the sandbox

Going through the commands this week, it was really great to see that almost everything worked as stated in the tutorials, although I had some issues accessing one of the directories from the sandbox server due to skimming over the changed directory listed in the assignments doc. I feel as though I'm learning a new language, although I wish that I could practice with my own files in my system. I don't seem to be able to arrow through with my Mac laptop keyboard, but I'm going to try the pc keyboard I use with my Mac mini in a little while. I'm also wondering about the "-" key on my laptop keyboard, as this did not seem to work with some of the commands when trying to activate the long format.

Copying over the files seemed to work well, as did the simple commands such as ls & pwd – this latter command is like a compass if you get lost. I'm embarrassed to admit that sometimes I associate the cd with "command" instead of "change directory"! Not good when trying to navigate through, as it leaves one feeling thoroughly turned around. I think it will be a while before I use the rm command. Deletion that can't be undone – yikes!
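
A pocket summary of this round's commands for my definition journal – the directory and file names are just examples:

    pwd              # the compass: print where you are
    cd /home/mary    # change directory (not "command"!)
    ls -l            # the long-format listing where the "-" matters
    rm notes.txt     # deletes with no undo, so handle with care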

Tuesday, May 25, 2010

more Ubuntu

Delving further into the New to Ubuntu thread, I have to admit that I found the Psychocats Ubuntu guide very helpful even though it is targeted towards Windows users (I'm a mac user, but I started with a pc and still use one at work whenever I'm using Publisher or have issues with mac-"unfriendly" programs). This link also includes an overview of the community surrounding open source and information-sharing through the all-volunteer forums. I especially like the page on how to install popular proprietary software, as this seems like one of the first issues I will encounter even if the Ubuntu install goes smoothly. This operating system based on the Linux kernel is starting to make sense! Now I'm looking forward to unit 2 and the actual download…

Sunday, May 23, 2010

Beginning...

This is the class I've been waiting for, the "hands-on" technology course in my 6-class graduate certificate program! Already I've got my separate "technology terms" dictionary in the form of a pocket-size journal, perfect for the many terms I will be looking up & writing down through the course of the summer. Funny how I never thought of a server in terms of the client-server relationship. This makes so much sense (having waited tables for a number of years before returning to school, the metaphor hits home).

Looking through the Ubuntu forum, I'm so grateful for the thoughtful posting titled New to Ubuntu? Start here..., and I have bookmarked the Ubuntu pocket guide via google books (I'm saving my storage space for the numerous assigned downloads:)