DoctorScar
19 Feb 2011, 01:19 PM
Hi all,
I'm in the process of updating my course websites (I'm a math professor) - mostly just for fun, but also to make them easier to maintain on a day-to-day basis. Anyway, right now I have a bunch of PHP scripts which read CSV files and display relevant data (like announcements, the next homework assignment, etc.) based on the current date and the dates listed in the CSV file. I'm planning on moving to XML files or an SQLite database, but that's another story.
My question is about the best way to sift through the data. Right now, the PHP scripts run every time a user navigates to the particular page. This means reading all the data into a 2D array, and then displaying the relevant entries. There are no problems with this - I don't have very many readers, and there isn't that much data. But I'm wondering if there is a better way.
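For concreteness, here is a minimal sketch of the current approach described above: read the whole CSV into a 2D array on every page load. The file name and column layout here are assumptions, not the actual files.

```php
<?php
// Sketch of the per-request approach: parse the entire CSV file
// into a 2D array every time the page loads. The file name is a
// placeholder for illustration.
function loadRows(string $csvFile): array {
    $rows = [];
    if (($fh = fopen($csvFile, 'r')) !== false) {
        while (($row = fgetcsv($fh)) !== false) {
            $rows[] = $row; // each $row is one CSV record as an array
        }
        fclose($fh);
    }
    return $rows;
}
```

Every request pays the full parse cost, which is what prompts the question below.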
Since the "current" data only changes two or three times a week, it seems wasteful to parse all the data every time a user loads the page. Instead I could perhaps create a current_data.xml file the first time any user accesses a page since the most recent class period. So for example, if a homework set was due on Tuesday, the first time anyone browses to the site on Wednesday, a script runs updating which data is "current", and then I just use that data until the next class period. I would need to check the date each time the page is loaded, but I'm doing that anyway. The difference is I'm not parsing the data every time - just once.
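The scheme described above could be sketched roughly like this: regenerate a cached "current data" file only when it is older than the most recent class period, and otherwise serve the cached copy. Everything here is hypothetical - the file names, the Tuesday/Thursday schedule, and the assumption that column 0 holds a YYYY-MM-DD date are all placeholders to illustrate the idea (JSON is used for the cache just to keep the sketch short; an XML file would work the same way).

```php
<?php
// Hypothetical sketch of the caching idea. File names, the class
// schedule, and the CSV layout are assumptions for illustration.

// Most recent class period at or before $now, assuming classes
// meet Tuesday (N=2) and Thursday (N=4).
function lastClassTime(DateTime $now): DateTime {
    $t = clone $now;
    $t->setTime(0, 0, 0);
    while (!in_array((int)$t->format('N'), [2, 4], true)) {
        $t->modify('-1 day');
    }
    return $t;
}

function currentData(string $csvFile, string $cacheFile, DateTime $now): array {
    // If the cache was written after the most recent class period,
    // it is still fresh: skip parsing entirely.
    if (file_exists($cacheFile) &&
        filemtime($cacheFile) >= lastClassTime($now)->getTimestamp()) {
        return json_decode(file_get_contents($cacheFile), true);
    }
    // Cache is stale (or missing): parse the full CSV once and keep
    // only rows whose date column (assumed column 0, YYYY-MM-DD)
    // is today or later.
    $current = [];
    if (($fh = fopen($csvFile, 'r')) !== false) {
        while (($row = fgetcsv($fh)) !== false) {
            if ($row[0] >= $now->format('Y-m-d')) {
                $current[] = $row;
            }
        }
        fclose($fh);
    }
    file_put_contents($cacheFile, json_encode($current));
    return $current;
}
```

The date check still happens on every request (as the post notes, it must), but the full parse only happens on the first request after a class period.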
Is this something worth pursuing?
Thanks,
Oscar.