LITA Top Tech Trends

This was a really great session. The speakers were engaging and the topics covered were awesome. That's not to take away from any of the other sessions I attended, but Tech Trends just happens to be in my wheelhouse.

(https://www.chargespot.com/news/state-of-wireless-charging-the-present-and-the-future/)

  • Cloud Computing- Chromebooks to replace traditional PCs. Georgia worked with Google and added print and time management.  More info http://galibtech.org/?s=Chromebook
      • Q 1: Is it scalable? Google Cloud Computing YES!
      • Q 2: Is it affordable? Dist Charging = No! Very $
      • (@mbreeding) Open source ILSs haven't changed the game, but they have shaken it up. About 12% of public libraries run open source ILSs.
      • Uneven Access to technology is a BIG deal.
    • Theme 2- New models of innovation and entrepreneurship
      • (@tmradniecki) Makerspaces – Most cater toward youth, which is usually cheaper, too. Sewing machines can be popular. They have a laser cutter. Multimedia creation is a makerspace offering. So with our DML, 3D printer, and Mindstorms we technically have what we need to start. The ability to make prototypes is a huge benefit for entrepreneurs. Some academic libraries are starting to create networks to evaluate makerspaces and come up with makerspace competencies. When starting a makerspace, make sure you can answer WHY you're creating it. (@vjpitch): Biggest mistake libraries make: trying to do everything for everyone. We end up doing nothing for anyone.
    • Theme 3- New models for partnership
      • (@ACDH_OeAW) Social Media Outreach- Social media platforms are designed for two-way conversations with your community. That makes them a powerful tool. You need to assign enough staff to handle the load. Don't be so formal! It's essentially small talk. Don't just tweet events and stuff. Don't be afraid of personality. It's also OK to have different personalities; that will broaden your reach because each person reaches a different audience. We do most of this well, but might want to question the single-voice decision.
      • Open Licensing- Open access and licensing are both very important. In a digital space the whole world becomes your potential audience, so you need to make sure everyone is able to use what you offer. Creative Commons is a key to success in this area. A license is only useful to a potential user if they recognize it; hence CC. There seems to be a huge fear of open licensing in the US but not the EU.
    • (@vjpitch) eContent Revolution– Focus on access, content, and experience. http://popuppicks.com/ (links to a Biblioboard collection). Forget sign-ons; use IP authentication to allow access. Right now the content is being given away free for 3 months. Stop the "if you build it, they will come" mentality; you should reach people where they ARE. eBooks are a print solution to a digital problem. We can't be afraid to retire services that aren't being used.
    • Libraries are more guilty of lax security than vendors. For example, we should run our website over HTTPS; all the social media sites are HTTPS, and Google is HTTPS. (See the sketch below for one way to force it.)
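On that HTTPS point: if your site runs on Apache, one common way to force HTTPS is a rewrite rule in the site's .htaccess file. This is just a sketch, assuming mod_rewrite is enabled and an SSL certificate is already installed:

RewriteEngine On
# Redirect any plain-HTTP request to the HTTPS version of the same URL
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]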

Command Line Copy

Picture of Linux Terminal Window with Code

Every year we digitize The Wilmington Town Crier papers. We send 'em off and get PDFs on a USB drive (as well as the originals) returned. From there I load them into a Drupal website using Solr and Tika for indexing (a setup I don't recommend). But in order to do so, the PDFs must all be in the same directory.

Well, it turned out that the 2016 batch of papers put each issue in its own folder. Since I didn’t want to have to manually move all the files, I searched for a way to do it all at once. And I found it.

1.) Open a command prompt (start > run > cmd)

2.) Change directories to the USB drive

3.) Type the following code, replacing “target” with the folder you wish to drop all the files into.

for /r %f in (*) do @copy "%f" target

For example, here was my code:

G:\2016>for /r %f in (*.pdf) do @copy "%f" G:\2016

You'll notice I only copied the PDFs. That's because there were other files (.jp2, .txt, .tif) in those folders and I didn't care about them. If you want to copy everything, just leave it as (*).
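If you'd rather save the command as a reusable batch file instead of typing it at the prompt, keep in mind that inside a .cmd/.bat file the loop variable has to be doubled (%%f). Here's a minimal sketch, with my drive and folder names as placeholders for your own:

@echo off
rem Recurse (/r) through every folder under G:\2016 and copy each PDF into G:\2016 itself
rem Note: inside a batch file the loop variable is written %%f, not %f
for /r "G:\2016" %%f in (*.pdf) do copy "%%f" "G:\2016"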

And that’s it!

Since I want to give credit where it’s due, here’s where I got the line of code: https://stackoverflow.com/questions/1502170/how-to-copy-files-from-folder-tree-dropping-all-the-folders-with-robocopy

Auto Start Firefox in Ubuntu

Picture of a fox with his tongue stuck on a window

Image Courtesy of  DeviantArt.net

We use old desktops running Ubuntu for catalog stations throughout the library. Since most folks aren't familiar with the Ubuntu OS, if one of these computers gets rebooted they don't know how to reopen the catalog. So I did some research and found a fast and easy way to open Firefox to the catalog at system startup. Here's how.

  1. Open a Terminal window
  2. Type cd .config/upstart/ and hit enter

Note: If the directory is not found, type ls -a to list the hidden files and confirm it exists (you may need to create it first with mkdir -p .config/upstart).

Terminal Window Showing CD to the .config/upstart command

  3. Type nano firefox-with-url.conf and hit enter

Note: If you get a "permission denied" error, try typing sudo before nano. Of course, you need sudo permission to issue that command, so you may need to resolve that before continuing.

Terminal Windows showing how to create a .conf file

  4. Type the following, replacing http://wilmington.mvlc.org with the page you want Firefox to load:

start on desktop-start

stop on desktop-end

exec firefox http://wilmington.mvlc.org

  5. Hit ctrl+X to exit
  6. Hit Y to save the changes and hit enter.

The commands to autostart Firefox in a .conf file.

And that’s it!

Robocopy for Backups

Office Space Robocopy Meme

The Problem

The library uses a Synology NAS device for staff documents. The original hardware came with the Data Replicator Software that would back up the contents of the NAS to a USB Drive attached to one of my computers. A few months ago said backup started to take upwards of 18 hours. So I went a-googling for answers. Turns out Synology discontinued that app years ago. So I needed to find a replacement.

When I couldn’t find any free software, my mind wandered back to my previous life as an applications analyst. Specifically to Robocopy. Alas! I’m sad to say the “robo” does not stand for “robot” but rather “robust”. Heartbreak aside, it turns out I could easily and, more importantly, quickly create a backup copy of the NAS myself. I simply created a .cmd file to robocopy the NAS to the USB Drive and then created a scheduled task to kick off the file every night. Here’s how you can do it, too.

Creating the .CMD File

Start in c:\Windows\System32\ 

1.) Right-click any white space

2.) Hover over New

3.) Click Text Document

4.) Type Robocopy Backup.cmd and hit enter

Note: Make sure you change the extension (the part to the right of the .) from txt to cmd.

5.) Click OK to the warning about changing the file type.

6.) Right-click the Robocopy Backup.cmd

7.) Click Edit

Screenshots of how to create a robocopy.cmd file on the desktop

8.) Type Robocopy \\SourceServer\SourceShare\ \\DestinationServer\DestinationShare\ /MIR /R:2 /LOG+:RobocopyBKUP.txt

Note: Make sure the entire command is on one line or else Windows won’t understand.

9.) Hit ctrl+s to save the file.

Robocopy commands with explanations
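In plain text, here's a minimal sketch of what the finished Robocopy Backup.cmd might contain (the server and share names are placeholders; swap in your own):

rem Mirror the NAS share to the backup drive and keep a running log
robocopy \\SourceServer\SourceShare\ \\DestinationServer\DestinationShare\ /MIR /R:2 /LOG+:RobocopyBKUP.txt
rem /MIR    mirrors the source: copies new and changed files and deletes files that no longer exist on the source
rem /R:2    retries a failed file only twice (the default is one million retries)
rem /LOG+:  appends each run's output to RobocopyBKUP.txt so you can review it later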

Task Scheduler to Kick off the .CMD File

10.) Go to Control Panel > Administrative Tools > Task Scheduler

Path to Task Scheduler

11.)  Go to Action > Create Basic Task

12.) Type Robocopy Backup in the Name: field and click Next

13.) Make sure the Daily circle is selected as a Trigger and click Next

14.) Specify a time (you probably want to make sure this is when you’re closed) and click Next

15.) Make sure the Start a program circle is selected and click Next

16.) Click Browse

17.) Navigate to the Robocopy Backup.cmd you just created and click Open

18.) Click Next

Basic Task Specifics

19.) Make sure all the settings are correct and click Finish

All Basic Task Settings

You can run it manually to make sure it works by clicking the task to select it and then clicking Run (the play button) on the right.
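You can also kick it off from a command prompt with schtasks, assuming you kept the task name Robocopy Backup:

schtasks /Run /TN "Robocopy Backup"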

Troubleshooting

Since each computer is slightly different, things may not work exactly as I describe. Click the task to select it, click Properties on the right, and then check the following settings:

  • Make sure the Run whether user is logged on or not circle is selected
  • Make sure the Run with highest privileges box  is selected
  • Make sure the ID you’re using has access to both source and destination paths

 

Change user on Basic Task

For a list of robocopy commands, see here: https://ss64.com/nt/robocopy.html

Site Hiding from Google

 

Thank you card with the Alphabet on the cover

This Is the Thanks I Get.

 

Credit where credit is due: I used the article below as a guide to solve this issue:

https://www.seomechanic.com/why-is-my-website-not-showing-in-google-search-results/

The Sitch

I had a patron come to me asking if I knew why their site wasn’t showing up in a Google Search. At first, I thought they were simply using too broad a search phrase. Turns out they were hacked and the hackers messed with some pages and their Robots.txt file. Oh, and did I mention it was a nursery school? Who hacks a nursery school?!

Sorry.

The recovery process is painful and time-consuming. So I wanted to share my experience in hopes I can help another unfortunate soul.

Search with Site URL

Search Engine Optimization (SEO) is the process a web designer undertakes to help the search engines find her site. There are many ways to do this, and SEO is something we could spend weeks on. Suffice it to say, the main way a site is found is through keywords. Since SEO is so competitive, your site might just not rank very high. A surefire way to see whether Google sees your site at all is to go to google.com and search your URL exactly as it appears, but put site: before it. For example:

site:bioniclibrarian.wordpress.com

Confirmation of Penalization

If Google sees it, the next thing you should check is if Google has penalized your site by going to https://ismywebsitepenalized.com and putting in your site name.

Screenshot of an ismywebsitepenalized.com result

If you get an ugly read error like the one above, then you’ve got work to do.

Make Any Edits To Robots.txt

The easiest fix is to make sure the Robots.txt file is allowing all the search engines to index the site. In my poor patron’s case, the hack messed with this and disallowed certain agents. We reset it to the default:

User-agent: *
Disallow:
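By contrast, a robots.txt that tells every crawler to stay out of the whole site looks like this (just an illustration of the kind of damage a hack can do, not the patron's actual file):

User-agent: *
Disallow: /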

You can read all about how to use the file here:  http://www.robotstxt.org/robotstxt.html

If you want to see one in action, click here: http://www.robotstxt.org/robots.txt

Make a Sitemap

Sometimes the penalty is because you don't have a sitemap file. A sitemap is a file (usually XML) that lists each page on your site. If you have a simple site, you can just create it yourself. If you have a site that's a little more complex, or you don't want to create it yourself, visit http://xml-sitemaps.com and you can create one there. Caveat: if your site is over 500 pages, this site won't be able to help you.
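For reference, a bare-bones sitemap.xml for a tiny site looks something like this (the URLs and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2017-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>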

.htaccess File

Once we added the sitemap and uploaded it to the site, Google was saying it STILL couldn’t see the file, even though we were looking right at it. I thought it sounded like a permission issue, but we didn’t have access to change the permissions on the HTML root folder because the login we had dropped us in that folder.  Enter the .htaccess file.

I've run across this file on one of the websites I manage, but in that case it had to do with permissions. Apparently, it can also be used for redirects, which is where this journey ended up going.

The patron himself ended up finding this issue and ultimately resolving it. I love it when that happens. That means I’ve taught him how to fish…so to speak.

Here’s a resource on the issue: http://www.htaccess-guide.com/redirects/ 
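For illustration, a basic redirect in an Apache .htaccess file looks something like this (the paths are made up, not the patron's actual ones):

# Permanently send visitors (and search engines) from the old page to the new one
Redirect 301 /old-page.html http://www.example.com/new-page.html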

Make a Reconsideration Request

If your site still refuses to show up in Google, you can always make a reconsideration request. The process is essentially this:

  1. Sign into your Search Console account.
  2. Verify all versions of your site to ensure you have complete and accurate data.
  3. Visit the Manual Actions section to see if Google has taken any actions on your site.
  4. Fix issues on your site as described by the manual action.
  5. Review Security Issues in Search Console for other possible issues with your site.
  6. Click on ‘Request a review’ to ask Google to reconsider your site.

I took those steps from here: https://support.google.com/webmasters/answer/35843?hl=en

 

Digital Divide Podcast

Interpretation of the Digital Divide showing a small bag of money on one side of a valley from a big bag of money. Across the valley is a series of papers to act as a bridge

One of the reasons I decided to go into public libraries rather than academic libraries upon finishing my Master's degree was the dastardly Digital Divide. For those of you who may not know what the Digital Divide is, Merriam-Webster defines it as:

the economic, educational, and social inequalities between those who have computers and online access and those who do not

My library takes an active role in building this bridge through such things as my one-on-one tech help sessions. The sky, or should I say cloud (what? I shouldn’t? A-hem. Well OK then.) is the limit. I help patrons with eReaders, email, websites, social media sites, anything they ask. While I don’t claim to have all the answers, I can do some research beforehand to find a particular one. If that doesn’t work, then I can show them how to use Google and our databases hidden behind paywalls to get the answer. They’ve proved muy popular in my time here. I had 160 sessions in 2016 alone. But the thing is, I don’t make house calls. So that means patrons must come to the library for tech help. Until now.

One of those aforementioned sessions was with a lass from WCTV, the Wilmington local access cable channel. We went over some of the basics of computers and she found it so helpful that she thought we ought to do it again. This time in podcast form. So we did. Our first episode of Bridging the Digital Divide can be heard here:

I'm excited about this. Not only is it a fantastic example of how libraries can partner with companies to meet patrons where they are, but it also gives me a chance to use new (to me) technology. I've never been in a recording studio before, and it was fun to drop some killer beats. Well, maybe not killer beats (I need to stop saying that, don't I?), but a useful recording at the very least.

Stay tuned for more episodes.