Pushrod

Old dogs, new tricks

ebay_shopping is now a gem (and now at github)

with 16 comments

OK. I’ve jumped on the Git bandwagon, and to celebrate have made the ebay_shopping plugin (a Ruby on Rails library for eBay’s Shopping API) into a gem, hosted at GitHub. It’ll take a few days before the rubyforge project is approved and loaded up, but you can still download it from GitHub, or clone it with git.

Why turn it into a gem? Well, it was originally a plugin as that was the easiest and quickest way to do it at the time — it was generated from a Rails application, after all. It also made things like getting the initial config (from a YAML file in the Rails config directory) a no-brainer, and meant I could use some of the ActiveSupport methods without thinking.

But over the past few weeks, I’ve been playing around with Merb, and decided these benefits are more than outweighed by the greater portability a gem brings. There’s also the benefit of versioning and dependencies. Finally, with Rails edging away from plugins with 2.1, and the ease of gem generation using Dr Nic’s newgem gem, there’s really no reason to stay with the plugin approach. Enter, stage left, the ebay_shopping gem.

If you’re already using the plugin, there’s no hurry to change. If not, give the gem a try and let me know how it goes. Use is almost identical to the plugin. The only difference is with the initial configuration. You can still use the same YAML config file in your Rails config folder (if you’re using Rails or Merb); you just need to set it up explicitly in your environment.rb:

EbayShopping::Request.config_params("#{RAILS_ROOT}/config/ebay.yml", RAILS_ENV)

Passing the RAILS_ENV as the second param just ensures it will use the correct environment settings from the config file, if you’ve got different ones for development, production, test, etc. If not, or if you don’t tell it, it will just default to the production settings.
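
If you’re on Merb rather than Rails, the same call should work using Merb’s equivalents of those constants (a sketch, assuming the YAML file lives in your Merb app’s config directory):

EbayShopping::Request.config_params("#{Merb.root}/config/ebay.yml", Merb.env)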

You can also (from the console, for example) set the initial configuration with a Hash:

EbayShopping::Request.config_params({:app_id => "my_app_id", :default_site_id =>"3"})

The hash must provide the app_id you’re given by eBay, and can optionally provide the eBay affiliate info and your preferred default country (e.g. site id "3" is the UK in the example above). This can be overridden in individual requests, or if you leave it out altogether it will default to ebay.com.
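
If you’ve got affiliate details, the hash can carry those too; assuming it takes the same keys as the YAML config file, it would look something like this (all values are placeholders):

EbayShopping::Request.config_params({ :app_id               => "my_app_id",
                                      :default_site_id      => "3",               # eBay UK
                                      :affiliate_partner    => "9",               # eBay's own affiliate scheme
                                      :affiliate_id         => "my_campaign_id",  # a.k.a. CampaignId
                                      :affiliate_shopper_id => "my_campaign" })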

Enough waffle. Explore the code over at GitHub. The documentation still needs tweaking, but the test suite and code comments should explain it all fairly well. Plus there are some usage examples on the post about the original plugin, which still stand. Patches and forks welcome — this is git we’re talking about, after all.

Update:

I’ve finally gotten around to adding the gemspec file which allows GitHub to build the gem automatically. So now all you need to do is the usual:

gem sources -a http://gems.github.com

sudo gem install ctagg-ebay_shopping

Written by ctagg

May 13, 2008 at 4:54 pm

Posted in ebay, rails, ruby


Facebook templates made easy with Rails 2.0 custom Mime types

with 3 comments

I’ve written elsewhere about how I used my own lightweight library to add Facebook functionality to Autopendium :: Stuff About Old Cars, the classic car community website I run.

The library has made it fairly easy to keep up with Facebook’s many changes, and the Facebook app has been a good marketing tool for the site. But adding more functionality to the app has meant duplicating code, as all the actions are handled by a FacebookController.

However, now that I’ve updated to Rails 2.0, adding Facebook functionality is a whole lot easier, and the solution is so simple, I’m sure it’s a common usage pattern.

Let’s take the Autopendium classic car events calendar, which we’ve just introduced:


I don’t really want to do an events action in the FacebookController just to make it available in the Facebook app; I’d rather just use the EventsController#index action and render it with a custom template. (We’re already doing something similar for ics MimeTypes — serving up the events in iCalendar format, so they can be imported directly into your electronic calendar, but that’s for another post).

Custom Mime Types to the rescue

With the new custom Mime types in Rails 2.0, it’s a breeze. Often these are used for customising apps for the iPhone (as shown here), but I reckoned the situation was pretty similar with integrating Facebook interfaces to existing apps.

If the request comes in via the Facebook canvas, we need all the custom Facebook FBML to specify the style, etc. (and even if you’re using an iframe, you’ll probably want a custom layout).

So this is what I did. First, add a custom facebook mimetype in your environment file:

 Mime::Type.register_alias "text/html", :facebook

Then you need some way to recognize that you’ve received a request from your Facebook app.

You could do a check on the params, as all requests from Facebook have a number of Facebook-specific parameters (fb_sig, etc, which is covered briefly in the second part of my Facebook lightweight library posts). This has the added advantage that you can use the normal URLs in your facebook templates/links. However, if you’re using REST-type routes — as I am — you may end up with difficulties for the moment, as all requests from Facebook are POSTs. (According to FB, this may change in the future, and there’s already a parameter in the request which indicates what method the original request used.)
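
A minimal sketch of that params-based check (the filter name is mine; it assumes only that Facebook requests carry fb_sig* parameters, as described above):

class ApplicationController < ActionController::Base
  before_filter :detect_facebook_request

  private

  # Serve the :facebook format whenever any fb_sig* parameter is present
  def detect_facebook_request
    request.format = :facebook if params.keys.any? { |key| key.to_s =~ /\Afb_sig/ }
  end
end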

A simpler way is to use the routes (or possibly a subdomain, as shown in the iPhone example). I’ve already got the Autopendium FB app set up so that all requests from Facebook have a base URL of autopendium.com/facebook/ (i.e. what Facebook calls the callback URL). This normally sends everything to the Facebook controller, so /facebook/latest goes to the #latest action in the FacebookController. However, if I add a couple of lines to my routes.rb file:

map.connect 'facebook/:controller', :format => 'facebook', :action => 'index'
map.connect 'facebook/:controller/:id', :format => 'facebook', :action => 'show'

… I get facebook/events (which is what a link to apps.facebook.com/autopendium/events in the Facebook canvas generates) routing to the index action in the Events controller with our custom :facebook format. Likewise facebook/events/3 will route to the show action in the Events controller with an :id of 3 in the params hash.

Then in my Events controller, I just add an additional line to the respond_to block:

respond_to do |format|
  format.html      # index.html.erb
  ...
  format.facebook  # index.facebook.erb
end

This means the response for the Facebook request will be served using a special facebook template, with no render :template => "special_facebook_template" needed.

Classic Car Events in Facebook

Even better, it will automatically use a custom Facebook application layout if one exists (called application.facebook.erb). So all your standard links, frames and CSS can be included without a single extra line. Mine looks something like this:

<%= stylesheet_link_tag "facebook_basic" %>
<fb:header decoration="add_border">Autopendium :: Stuff about old cars</fb:header>
<br />
<fb:tabs>
<fb:tab_item href="http://apps.facebook.com/autopendium" title="Intro">Intro</fb:tab_item>
<fb:tab_item href="http://apps.facebook.com/autopendium/show" title="My Autopendium">My Autopendium</fb:tab_item>
<fb:tab_item href="http://apps.facebook.com/autopendium/events" title="Classic Car Events">Classic Car Events</fb:tab_item>
</fb:tabs>
<div class="container">
<%= yield  %>
</div>

Written by ctagg

April 12, 2008 at 12:54 pm

the ebay shopping api and the new ebay affiliate scheme

leave a comment »

As you may have heard, starting from April 1, eBay is phasing out its old affiliate schemes in favour of its own home-grown one.

I won’t go into the pros and cons of the change here (for Autopendium, the classic car website I run, on balance it’s probably a good thing, if only from an admin point of view), but I did think it worth mentioning how to update your config file for ebay-shopping, the Rails library I wrote for the eBay Shopping API.

Step 1: Update your ebay.yml file with the new settings

:production:
  :app_id: "your_api_app_id_code"         # this doesn't change
  # "9" signifies you are using eBay's own affiliate scheme, and after
  # the end of April it will be the only working choice
  :affiliate_partner: "9"
  # the new affiliate code from eBay, also called a CampaignId
  :affiliate_id: "your_new_ebay_affiliate_code"
  :affiliate_shopper_id: "my_campaign"    # doesn't need to change

The CampaignID is the only tricky bit, as eBay sometimes also refers to it as CampID or Tracking Partner ID. Once you’ve signed up for the affiliate scheme, click on the Campaigns tab to find it (you can actually have more than one campaign and hence more than one CampaignID).

Step 2: Restart your server. Er, that’s it.

Written by ctagg

April 4, 2008 at 6:48 pm

Posted in ebay, rails, ruby


Quick fix: uninitialized constant Gem::CommandManager and gemsonrails

leave a comment »

A quick note to save someone a couple of hours of doing what I did when I got an "uninitialized constant Gem::CommandManager" exception while trying to freeze a gem for my Rails project. Googling for the error led me to believe I’d got a problem with rubygems, so I tried everything in that area — updating rubygems, reinstalling it, updating to Ruby 1.8.6, blah, blah, blah.

Turned out all I had to do was:

sudo gem update gemsonrails # update the gem

gemsonrails # update the tasks in the gemsonrails plugin folder

Er, that’s it. Doh!

Doh

Photo courtesy of striatic

Written by ctagg

February 15, 2008 at 7:28 pm

Posted in rails


A Ruby on Rails library for the ebay shopping API

with 21 comments


After the lightweight Facebook library I wrote to scratch my own itch, a couple of days ago I started to look at adding ebay items to Autopendium :: Stuff About Old Cars, the classic car website I run. Users were already shown books from Amazon, appropriate to the content being shown on the page, and it seemed to make sense to show models, cars and parts from ebay for the vehicle or model being displayed.

Amazon books on Autopendium

I’d had a look at adding ebay functionality quite a while back, when I’d only just started to use Ruby and Rails, and couldn’t quite get to grips with ebay4r, which was at that time the ebay API ruby library. Since I’d last looked, another library had been written, Cody Fauser’s ebayapi, which he introduces with a brief tutorial here, and having a quick look at the code and the tests, it seemed just the job. I then fired up IRB and gave it a test drive in the console.

It all seemed fine, just rather slow. The problem is, the library uses ebay’s SOAP interface, which is markedly slower than the REST one. And in fact, even the Trading REST interface is slower than the Shopping interface, as a quick and dirty benchmark shows:


  user       system       total      real
0.050000    0.030000    0.080000  (  6.487812) # 10 calls to the shopping REST API
0.130000    0.060000    0.190000  ( 12.517658) # 10 calls to the trading REST API
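
The timing format above is what Ruby’s standard Benchmark library produces; a comparison along these lines would reproduce it (a sketch only: the two search_via_* methods are placeholders for equivalent keyword searches against each interface, not real library calls):

require 'benchmark'

# Placeholders for an equivalent keyword search made via each interface
def search_via_shopping_api; end
def search_via_trading_api;  end

Benchmark.bm(10) do |b|
  b.report("shopping") { 10.times { search_via_shopping_api } }
  b.report("trading")  { 10.times { search_via_trading_api } }
end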

Now, if you want all the functionality that the Trading API provides — the ability to bid on items, or to list new items — that speed trade-off is no problem, as the user will expect such things to take a couple of seconds.

But if you’re wanting to include items for sale on a page each time it’s displayed (even allowing for caching), each 1/10th of a second counts, and the extra functionality that the Trading API provides is irrelevant.

Unfortunately, there’s no Ruby or Rails library for the Shopping API. So, time to scratch my own itch again. Enter ebay-shopping, a Ruby on Rails plugin for the ebay Shopping API. It’s a pretty straightforward plugin that was fairly easy to write (the first version, implemented as a basic lib file, was done in an afternoon), and is even easier to use.

To install, from the root of your rails app simply run the usual

script/plugin install http://ebay-shopping.googlecode.com/svn/trunk/ ebay_shopping

Then run

ruby vendor/plugins/ebay_shopping/install.rb

This will copy a basic configuration file into your app’s config directory. This is where you put your ebay settings (eBay application id, affiliate info if you have it, etc). Update this with your settings — the only thing you actually need is the app id, which you can get by signing up at http://developer.ebay.com (the code you need is called the AppID — the Auth Token and other stuff is for the Trading API).

Then fire up the Rails console and away we go:


>> request = EbayShopping::Request.new(:find_items, :query_keywords=>"Cadillac")
=> #<EbayShopping::Request:0x246aa54 @affiliate_shopper_id="my_campaign", @affiliate_partner="1",
@site_id=nil,@affiliate_id="foo1234bar", @callname=:find_items, @call_params={:query_keywords=>"Cadillac"},
@app_id="my_ebay_app_id_1234567">

>> response = request.response
=> #<EbayShopping::FindItemsResponse:0x2444520 @request=#<EbayShopping::Request:0x244a36c,
@url="http://open.api.ebay.com/shopping?version=547&appid=my_ebay_app_id_1234567&callname=FindItems&QueryKeywords=Cadillac",
@affiliate_shopper_id="my_campaign", @affiliate_partner="1", @site_id=nil, @affiliate_id=nil, @callname=:find_items,
@call_params={:query_keywords=>"Cadillac"}, @app_id="my_ebay_app_id_1234567",
@full_response={"Version"=>"547", "Timestamp"=>"2008-01-13T13:20:27.641Z", "Build"=>"e547_core_Bundled_5879814_R1",
"Item"=>[{"ShippingCostSummary"=>{"ShippingType"=>"NotSpecified"}, "ListingStatus"=>"Active", "TimeLeft"=>"P20DT16H59M6S",
"PrimaryCategoryName"=>"eBay Motors:Cars & Trucks:Cadillac:STS", "Title"=>"Cadillac : STS",
..."ItemSearchURL"=>"http://search.ebay.com/ws/search/SaleSearch?fsoo=2&fsop=1&satitle=Cadillac",
"Ack"=>"Success", "TotalItems"=>"15580", "xmlns"=>"urn:ebay:apis:eBLBaseComponents"}>

>> response.total_items
=> 15580

To get the items from the response, just ask for them


>> first_item = response.items.first
#<EbayShopping::Item:0x2413a88 @gallery_url="http://thumbs.ebaystatic.com/pict/230212386614.jpg",
@all_params={"ShippingCostSummary"=>{"ShippingType"=>"NotSpecified"}, "ListingStatus"=>"Active",
"TimeLeft"=>"P20DT16H59M6S", "PrimaryCategoryName"=>"eBay Motors:Cars & Trucks:Cadillac:STS",
"Title"=>"Cadillac : STS", "ConvertedCurrentPrice"=>{"currencyID"=>"USD", "content"=>"9500.0"},
"GalleryURL"=>"http://thumbs.ebaystatic.com/pict/230212386614.jpg", "ItemID"=>"230212386614",
"ListingType"=>"FixedPriceItem", "EndTime"=>"2008-02-03T06:19:33.000Z", "PrimaryCategoryID"=>"124117",
"ViewItemURLForNaturalSearch"=>"http://cgi.ebay.com/Cadillac-STS_W0QQitemZ230212386614QQcategoryZ124117QQcmdZViewItem"},
@view_item_url_for_natural_search="http://cgi.ebay.com/Cadillac-STS_W0QQitemZ230212386614QQcategoryZ124117QQcmdZViewItem",
@end_time="2008-02-03T06:19:33.000Z", @primary_category_name="eBay Motors:Cars & Trucks:Cadillac:STS",
@converted_current_price={"currencyID"=>"USD", "content"=>"9500.0"}, @title="Cadillac : STS",
@item_id="230212386614", @time_left="P20DT16H59M6S">

The key attributes for the item are available through ruby-ized versions of the ebay attributes (see the full documentation for the Shopping API calls and responses):


>> first_item.title # for the Title attribute
=> "Cadillac : STS"
>> first_item.gallery_url # for the GalleryURL attribute
=> "http://thumbs.ebaystatic.com/pict/230212386614.jpg"
>> first_item.view_item_url_for_natural_search # for the ViewItemURLForNaturalSearch attribute
=> "http://cgi.ebay.com/Cadillac-STS_W0QQitemZ230212386614QQcategoryZ124117QQcmdZViewItem"
>> first_item.bid_count
=> nil
>> first_item.primary_category_name
=> "eBay Motors:Cars & Trucks:Cadillac:STS"

As you can see, most of these responses are just strings. For the price, you’ve got a couple of options


>> first_item.converted_current_price
=> #<EbayShopping::Money:0x1410b70 @content=9500.0, @currency_id="USD">
>> first_item.converted_current_price.content
=> 9500.0

or


>> first_item.converted_current_price.to_s
=> "$9500.00"

It’s also worth noting the end time is returned as a Ruby Time object, so you can do calculations against it


>> first_item.end_time
=> Sun Feb 03 06:19:33 GMT 2008
>> first_item.end_time.class
=> Time
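
So, for example, getting a rough hours-remaining figure is just plain Time arithmetic (an illustrative line, not a library method):

hours_left = ((first_item.end_time - Time.now) / 3600).round  # Time subtraction gives seconds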

Finally, there’s a catch-all [] method which allows you to access other attributes using a familiar hash key notation:


>> first_item["ShippingCostSummary"]
=> {"ShippingType"=>"NotSpecified"}

Other methods and usage are given in the code comments and the fairly extensive test suite (browse the source here). There are also hooks to allow for caching and (separately) error caching, which is necessary if you want to get your app approved as a Compatible Application, allowing you a greater number of API calls per day (I did). I’ll post on usage of these, with examples, if anyone wants me to.

Tie that into your Rails app, and you’ve got an instant mash-up:
Ebay items on Autopendium

At the moment, the library’s only available as a Ruby on Rails plugin, rather than a Ruby gem. The only reason for this is that it was extracted from a Rails app, and is slightly structured accordingly (e.g. the YAML config file, and the option for different settings in different environments). However, it’s probably not a huge job to package it as a gem, or to use the code as-is in a standalone Ruby app.

p.s. Some of the less frequently used API calls haven’t yet been implemented, but are being done bit by bit, and if anyone’s got a crushing need for one of the missing ones, let me know, and I’ll bump it up the priority list.

Written by ctagg

January 13, 2008 at 5:47 pm

Rails 2.0 gotcha: count_from_query plugin

leave a comment »

I’ve got a fairly comprehensive test suite for Autopendium :: Stuff About Old Cars, the classic car community site I run, which makes upgrades of the framework fairly stress-free.

By stress-free, however, I don’t mean trouble-free — there are going to be failing tests, and there are going to be problems. However, I’m fairly confident that if the tests pass, the update to the production server will be without problems (particularly since I’ve started using a staging server).

So it was with upgrading to 1.2.6 — I had only a couple of deprecation warnings, and some failing tests, most of which were due to some problems with my routing.

Upgrading to 2.0.2 has proved a bit trickier, however, mainly because the error messages (and they are errors, rather than failures) aren’t helping me find the root cause of the problem, only telling me what ultimately brings the whole thing crashing down.

Running the unit tests via the console I get this horror:

Rails 2.0 unit test errors

OK. Let’s take this bit by bit. So I run the unit tests for WikipediaEntry:

Rails 2.0 unit test passes

Hmm. This smells… and the smell is called… fixtures.

Long story short, by trawling through the test logs, using ruby-debug, and getting to grips with the ActiveRecord code, I found out that the count_from_query plugin I use (which makes generating counts from complex custom finders a cinch) wasn’t playing well with the new ActiveRecord behaviour, which has changed a bit since the 1.2 branch, nor with the new faster fixtures.

[As an aside, this sort of thing is why if you’re serious about using Rails you must learn Ruby, and why it’s a good idea only to use lightweight plugins you can understand.]

The offending line in the plugin is towards the bottom, where the plugin’s methods are added to ActiveRecord::Base.

  def self.included(receiver)
    receiver.send :include, ClassMethods
    receiver.extend(ClassMethods)
  end

This method (or callback) is invoked when, to quote the Pickaxe, “the receiver is included in another module or class”. Thus, “receiver.extend(ClassMethods)” extends ActiveRecord::Base with the methods which are contained in the ClassMethods module.

OK, this makes sense, as the methods consist of the count_by_query method and a method_missing, which tests whether the missing method that was called ends with _count, and calls count_by_query if it does. This means that if you have a class method called #find_complicated_stuff you can call #find_complicated_stuff_count, which is great and makes will_paginate, for example, much easier in some edge cases.
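
Something along these lines, as a purely hypothetical sketch of the pattern just described (this is not the plugin’s actual code, and the count_by_query signature is assumed):

module ClassMethods
  def method_missing(method_id, *args, &block)
    if method_id.to_s =~ /\A(.+)_count\z/
      count_by_query($1, *args)  # assumed signature, for illustration only
    else
      super
    end
  end
end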

The problem lies with the previous line, "receiver.send :include, ClassMethods", which includes the ClassMethods module as well as extending with it, and so adds its methods as instance methods too. This doesn’t work with Rails 2.0 for two reasons:

1) If we get a tag cloud, for example, which might look something like this:


    find(:all, {
          :select => "tags.*, COUNT(*) AS tag_count", 
          :joins => "INNER JOIN taggings",
          :conditions => "taggings.tag_id = tags.id", :group => "tags.name",
          :order => "tag_count DESC, name ASC",
          :limit => 10})

The resulting count for each of the returned tags can then be accessed through #tag_count. Except that this will be intercepted by the method_missing in the count_from_query plugin (not sure why this didn’t happen in 1.2 — perhaps something to do with the load order of the plugin?).

2) When a failure occurs in the #count_from_query method, the way it works leaves #find in an unstable state, which means calling find (which the fixtures code does) results in doing a count. Result: kaboom!

For me the solution was to delete the offending line (looking at the specs this may be intentional behaviour rather than a bug, but I can’t really see a use for it, and certainly don’t need it).
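
With that line removed, the hook simply extends ActiveRecord::Base with the class methods and nothing else:

  def self.included(receiver)
    receiver.extend(ClassMethods)
  end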

Written by ctagg

December 30, 2007 at 7:08 pm

Solving the live-search/slow mongrel process problem

leave a comment »

I’ve been using the excellent nginx ever since I put Autopendium :: Stuff about old cars, the classic car community website I run, into production.

Now there’s another reason for using it: a sensible balancer that delivers requests only to those mongrels that aren’t busy. However, as Ezra Zygmuntowicz (whose post put me on to nginx in the first place) says:

“Now we all know that you should not have actions that take this long in our apps right? Of course we do, but that doesn’t stop it from happening quite a bit in a lot of apps I’ve seen.”

However, one place this problem can easily occur (even if you haven’t got any requests that you’d consider as ‘long-running’) is live-search.

In all the standard Rails recipes for livesearch, it works by observing a text input box, and sending the query to the app each time the text changes.

This is all fine and dandy in theory (well, apart from the repeated requests to the server, which is somewhat wasteful), and works fine in development where you’ve only got a single mongrel handling the requests of a single user. However, start adding some more mongrels (or FCGI processes, or whatever), and you get into all sorts of problems.

First, there’s the problem that in the user’s eyes, a live-search only works if the response is pretty much instantaneous. If the request is served to a mongrel that’s already handling someone else’s request and that request takes a couple of seconds to complete, then it’s not working for them.

However, another problem is maintaining the order of requests and responses. Say you’ve got a reasonably designed app, and are using a cluster of mongrels and caching to ensure that no page takes more than, say, half a second to process.

Then chances are, if you’re using live-search the standard way, you’ll get some unexpected behaviour (from the user’s perspective).

An example: Your user is searching for information on the Volvo Amazon and starts typing at normal speed in the live-search box: A-m-a-z-o. The whole thing takes less than a second, but what they see isn’t what they’d expect:

Livesearch example

Huh? How’d that happen? I type in Amazo, and get results for Ama? From the user’s point of view, it’s at best puzzling (and knocks the site down a notch in their eyes), and at worst useless (if they were unlucky they might have ended up with this):

Livesearch problem 2

The problem is that the mongrel that dealt with the request "/search/livesearch?term=Am" took a fraction of a second to get around to it because it was still finishing off a previous request (the ‘dumb’ round-robin proxy won’t know this, simply delivering requests to each mongrel in turn), and so it returned its response after the other mongrels had returned theirs.

How do you deal with this? For most CS graduates, this is probably a basic first-year problem, complete with the appropriate jargon. For me, a self-taught, greying old-car junkie, there appear to be three solutions:

  1. Make sure the requests are only passed to those mongrels who are free to deal with the request. This is what the fair proxy balancer for nginx and mongrel mentioned at the top does. The bonus is that this will improve the apparent responsiveness of your whole app. The only problem, I guess, could come if the later requests (i.e. those with more letters) take less time to complete than the earlier ones.
  2. Pass the requests to a faster backend server, one that isn’t handling more ‘meaty’ requests. After all, livesearch isn’t doing anything complicated, just searching the db, parsing and then returning the results. Perhaps this is a job for a custom mongrel handler or merb?
  3. Make just a single trip to the server, and then use local javascript to reduce the results on each successive letter.

Long story short, I went for option 3 with Autopendium, but delayed doing so for quite a while because I thought it would be very difficult. It turns out not to be.

First, I cleaned up the code at the server end. No more templates, no more converting the results to HTML, just return the search results as JSON.

def live_search
  @results = Modtype.find_trusted(:select => "id, title",
               :conditions => ['LOWER(title) LIKE ?', '%' + params[:term].downcase + '%'],
               :order => "title").collect { |m| { :title => m.title, :id => m.id } }
  render :nothing => true, :status => 404 and return if @results.empty?
  render(:text => @results.to_json)
end

As you can see, we just return a 404 if there are no results (we’ll use that later).

Then in the application.js file you need to add the observer to the text input. I’m a great believer in unobtrusive javascript and use Dan Webb’s excellent Lowpro library which I’ve written about before:

Event.addBehavior({
    "#header-livesearch-term": function(event) {
        new Form.Element.Observer('header-livesearch-term', 0.5, liveSearch );
    }
})

You could also use Rails’ built-in observe_field in the layout or template to achieve the same result, if you’re into that sort of thing. Something like:

<%= observe_field :suggest, :function => "liveSearch(element, value)",
      :frequency => 0.5,
      :with => 'term' %>

Then you’ve got the javascript to make the Ajax call to your app, parse the results, and render them on to the screen.

Alert: I’m very much a javascript hacker, picking up bits as I go along, so use this at your own risk:

Towards the top of my application.js file:

modtypeSearch = null; // setup global variables for the live-search results
initialModtypeTerm = null;

Then, a little later:

function liveSearch(element, value)
{
// Don't do anything unless we've got at least two characters to search for.
// You can change or delete this.
    if (value.length > 1) {

    // Check whether we've already got search results.
    // Also check that the term the existing results were generated from
    // is still valid for what's currently being searched for, i.e. 'am'
    // is good for 'amaz' but not for 'mot'

    if (modtypeSearch && value.match(initialModtypeTerm) ) {
        var termRegexp = RegExp(value, "i");
        // Given we've got the basic results, search within those
        var subSetResults = modtypeSearch.findAll(function(n) {return n.title.match(termRegexp);});
        // Convert the results to a list of links
        var htmlResults = resultsToLinks(subSetResults,termRegexp);
        // Update the results div
        $('live_search_results').update(htmlResults);
        }
    else {
        // We've got no valid results already, so make an Ajax request to the app
        new Ajax.Request('/search/live_search', {asynchronous:true, evalScripts:true, parameters:'term=' + encodeURIComponent(value),
            onFailure: function(transport) {
                // if there are no results the app returns a 404. We can use that to display a no results message
                $('live_search_results').update('<p class="highlight">"' + value + '" not found!</p>');
            },
            onSuccess: function(transport) {
                modtypeSearch = eval(transport.responseText); // update the global modtypeSearch array with results
                initialModtypeTerm = value;  // and also the term that was used to find them
                var termRegexp = RegExp(value, "i");
                var htmlResults = resultsToLinks(modtypeSearch,termRegexp);
                $('live_search_results').update(htmlResults);
            }
        });
      }
    }
}

resultsToLinks is just a simple function that converts the results array into a set of links, highlighting the search term with an em tag (which I’ve styled with CSS to have the standard bright yellow background):

function resultsToLinks(resultsArray,rExp) {
    var resultString = '<h4>"' + rExp.source + '" found in: </h4>\n<ul>' +
    resultsArray.collect(function(s) {return resultToLink(s,rExp);}).join("\n") + "</ul>";
    return resultString;
}
function resultToLink(r,term){
    var l='<li><a href="/modtypes/' + r.id + '">' + r.title.sub(term, '<em>#{0}</em>') +    '</a></li>'; return l;
};

And that’s pretty much it. The whole thing sidesteps the problems stated above, makes far fewer calls on the server, uses less bandwidth, and feels much faster to the user.

Written by ctagg

December 18, 2007 at 11:12 am
