Hi,
I'm trying to load info from an XML file into a table. The XML structure is roughly as follows (the actual file is a bit more complex, but I've simplified it here for illustration):
    <xml>
      <record>
        <id></id>
        <title></title>
        <year></year>
        <author></author>
        <journal></journal>
        <abstract></abstract>
      </record>
    </xml>
It contains about 300 <record> entries, with each tag holding text content (the <abstract> tags can each contain up to a couple of paragraphs). All up, it's quite a bit of data.
I'm trying to:
a) parse each <record> into its own row of a table, which I'm currently doing as follows:
    $.ajax({
      type: "GET",
      url: "ref.xml",
      dataType: "xml",
      success: function(xml) {
        $(xml).find('record').each(function(){
          var id      = $(this).find('id').text();
          var year    = $(this).find('year').text();
          var title   = $(this).find('title').text();
          var auth    = $(this).find('author').text();
          var journal = $(this).find('journal').text();

          // create the row with its year cell, then append the remaining cells to it
          $('<tr class="items visible searchable" id="link_'+id+'"></tr>').html('<td class="year">'+year+'</td>').appendTo('table tbody');
          $('<td class="author">'+auth+'</td>').appendTo('#link_'+id);
          $('<td class="title">'+title+'</td>').appendTo('#link_'+id);
          $('<td class="journal">'+journal+'</td>').appendTo('#link_'+id);
        }); // end each
      } // end success
    }); // end ajax call
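One thing I've been wondering about (but haven't actually tried) is whether batching the DOM writes would help, i.e. building all of the rows as one big string and appending it to the tbody in a single call, rather than four appendTo calls per record. A rough, untested sketch of what I mean, using the same fields as above:

    $.ajax({
      type: "GET",
      url: "ref.xml",
      dataType: "xml",
      success: function(xml) {
        var rowsHtml = '';
        $(xml).find('record').each(function(){
          var $rec = $(this);
          rowsHtml += '<tr class="items visible searchable" id="link_' + $rec.find('id').text() + '">' +
                        '<td class="year">'    + $rec.find('year').text()    + '</td>' +
                        '<td class="author">'  + $rec.find('author').text()  + '</td>' +
                        '<td class="title">'   + $rec.find('title').text()   + '</td>' +
                        '<td class="journal">' + $rec.find('journal').text() + '</td>' +
                      '</tr>';
        });
        $('table tbody').append(rowsHtml); // one DOM insertion instead of several per record
      }
    });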
and b) filter the table rows on the fly via a text input, as follows:
    // Send the search term to the 'filterAll' function
    var t; // debounce timer
    $('#filter').keyup(function(event){
      if (event.keyCode == 27 || $(this).val() == '') {
        // if Esc is pressed, clear the value of the search box
        $(this).val('').blur();
        // make each row visible again
        $('tbody tr').removeClass('visible odd even').show().addClass('visible').css({backgroundColor: ""});
        $('.visible:even').addClass('odd');
        $('.visible:odd').addClass('even');
      } else {
        var searchTerm = $(this).val();
        if (t) clearTimeout(t);
        t = setTimeout(function(){ filterAll(searchTerm); }, 200);
      }
    }); // end keyup
    function filterAll(phrase){
      var words = phrase.toLowerCase().split(" ");
      var table = $('#itemList tbody');
      var rows = table.find('tr.searchable');
      var ele;
      for (var r = 0; r < rows.length; r++){
        // strip the markup so only the cell text is searched
        ele = rows.eq(r).html().replace(/<[^>]+>/g, "").toLowerCase();
        for (var i = 0; i < words.length; i++) {
          if (ele.indexOf(words[i]) >= 0){
            rows.eq(r).show().addClass('visible');
          } else {
            rows.eq(r).hide().removeClass('visible');
            break;
          }
        }
        // after filtering, restore zebra stripes to the visible rows
        rows.removeClass('even odd').css({backgroundColor: ""});
        $('.visible:even').addClass('odd');
        $('.visible:odd').addClass('even');
      }
    } // end filterAll
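On the filtering side, something else I've been toying with (also untested) is caching each row's plain text once after the table is built, so the filter doesn't have to call .html() and strip the tags on every keystroke. Roughly something like this, where rowText and filterAllCached are just placeholder names:

    // cache the searchable text once (this would need to run after the
    // ajax success callback has finished building the table)
    var rowText = [];
    $('#itemList tbody tr.searchable').each(function(i){
      rowText[i] = $(this).text().toLowerCase();
    });

    function filterAllCached(phrase){
      var words = phrase.toLowerCase().split(" ");
      $('#itemList tbody tr.searchable').each(function(i){
        var matches = true;
        for (var w = 0; w < words.length; w++) {
          if (rowText[i].indexOf(words[w]) < 0) { matches = false; break; }
        }
        // show/hide the row and keep the 'visible' class in sync
        $(this).toggle(matches).toggleClass('visible', matches);
      });
      // zebra striping would still need re-applying here
    }

No idea whether that would be enough of a win on its own though, which is really my question.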
Both a) and b) are essentially working as I want them to, but my problem is that the browser slows to a crawl on page load (usually triggering the 'a script is causing this page to load slowly' warning), and filtering the data takes forever. And I'm not even touching the <abstract> data yet.
I'm sure the efficiency of my code could be improved a hundred-fold, but I'm not sure whether that would make enough of a difference to render the page usable, particularly as I want to include the <abstract> data later too. I'm wondering if I should be approaching this from a different angle altogether (e.g. converting the XML into a proper database, or using JSON?). This is my first foray into AJAX, and I've been given the data in XML format, so if anyone knows how to parse and filter large amounts of XML data quickly, any assistance would be much appreciated.
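For what it's worth, if JSON does turn out to be the better route, this is roughly how I picture the same records (values omitted, keys copied straight from the XML tags):

    // hypothetical JSON equivalent of ref.xml
    [
      {
        "id": "",
        "title": "",
        "year": "",
        "author": "",
        "journal": "",
        "abstract": ""
      }
      // ...and so on for the ~300 records
    ]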
PS - I've considered paginating the table to speed up the initial page load, but I've been avoiding that because I still want to be able to filter all of the rows on the fly, and pagination wouldn't fix that particular issue.
Cheers