Monday, September 14, 2009
I love having to program something I've never done before and having to figure out how the hell to do it. Man, it's nice to be able to work symbolically with the arguments I put into things like:
document.getElementById("div" + i).innerHTML = something.responseText;
and I can fill in a bunch of divs at one time.
Now I will go listen to MiniPop and have their beautiful melodies haunt me.
Sunday, September 6, 2009
Almost Social Networking
I got all excited and decided that I could make my own Google Wave. I started writing code yesterday afternoon and I have a chat client...that works like a very simple chat client. I used AJAX-type stuff to post the text in divs. The way (this is ultra tricky, I know) that data is passed between instances is by saving the text to a common text file, which is then periodically read by the JavaScript and posted. It's ultra low-tech but it does work. I wonder if I could do it with a web service. Almost certain I could. I learned a lot and it was fun too. Crazy. The thing is web-based, by the way.
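The guts of the file-based approach are roughly this (just a sketch; the path and method names are made up):
using System;
using System.IO;

string chatFile = @"C:\chat\chat.txt";   // the common text file everyone writes to

// called when someone sends a message
void Append(string message)
{
    File.AppendAllText(chatFile, message + Environment.NewLine);
}

// called by the periodic poll; the JavaScript drops this text into the divs
string ReadAll()
{
    return File.Exists(chatFile) ? File.ReadAllText(chatFile) : "";
}

Append("hello from one browser");
Console.WriteLine(ReadAll());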
Friday, September 4, 2009
Job Search and New Page Idea
I'm officially in the job hunt now, and I'm looking for volunteer opportunities because my resume isn't exactly overflowing with development experience. I program in C#, ASP.NET, JavaScript, and AJAX.
I'm thinking of a next project--it would be social networking in a sense, but musical. The idea is that two or more people will rate a piece of computer-generated music. Whichever piece is rated highest (of a group of 5) will move to the next round, be mutated, then voted on again. The crux is that people will get to do it "together". It would be interesting to see how ratings differ depending on who a person is paired with.
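Roughly the round logic I have in mind (the types, numbers, and the mutation step here are all placeholders):
using System;
using System.Collections.Generic;
using System.Linq;

// a "piece" is just an id plus the ratings it collected this round
var pieces = new List<(int Id, List<int> Ratings)>
{
    (1, new List<int> { 4, 5 }),
    (2, new List<int> { 2, 3 }),
    (3, new List<int> { 5, 5 }),
    (4, new List<int> { 1, 2 }),
    (5, new List<int> { 3, 3 }),
};

// the highest-rated of the five advances, gets mutated, and goes into the next round
var winner = pieces.OrderByDescending(p => p.Ratings.Average()).First();
Console.WriteLine("Piece " + winner.Id + " advances to the next round");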
Monday, August 31, 2009
Regex Pt. II
For some reason my page was transferring control to the client side when it should not have been at all. To solve the problem I put
myString.match(/^\d+$/)
in JavaScript to determine whether a string was made up only of digits. I didn't want to be passing non-numeric values to my C# routines. Was getting some nasty errors, boy. I'm still figuring out why my
javascript:CallMe
was being executed at all. I'll think on it.
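For good measure, the same check could happen again on the C# side before anything reaches the numeric routines (just a sketch; the variable names are made up):
string incoming = "123";   // whatever value came over from the JavaScript
int parsed;
if (int.TryParse(incoming, out parsed))
{
    // safe to hand off to the numeric routines
}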
Saturday, August 29, 2009
More User Friendly
I found some nice code to use for additional popups--nice simple popups that really make the page easy to understand. I may implement them tomorrow. I also downloaded an SDK for a phone that I just bought. The world of Linux??? Maybe not so much. I think the IDE is pretty distant from the Penguin.
Tuesday, August 25, 2009
more improvements
So I have simplified my page and it works better now. I have also added some instructions in the css popup that direct the user a little better. Almost ready for the world, it is. Got a G1, trying it out today. Weird change from that other smartphone I had.
Saturday, August 22, 2009
Wow what I have done
I've had such a productive summer. Mainly I figured out how to use PageMethods/WebMethods for passing params back and forth between C# and JavaScript. I found out that you can't really intercept and have your way with the passed param--it's kinda untouchable until you send it back to C#. I added a little CSS. I got my postback (there is a section that posts back) to end up in the correct spot using a method in page_render and an anchor. I figured out which tasks are okay to use a postback for and which ones can be AJAX'd. Also learned a lot about the crazy way that validation is implemented. Right now I think I have the validation pretty solid--no stray values are being input. Oh yeah, I learned how valuable JavaScript is.
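For reference, the C# half of a page method looks roughly like this (the class, method, and parameter names here are just placeholders):
using System.Web.Services;

public partial class MyPage : System.Web.UI.Page
{
    // JavaScript calls this as PageMethods.SaveChoice(value, onSuccess, onError),
    // with EnablePageMethods="true" set on the page's ScriptManager.
    [WebMethod]
    public static string SaveChoice(string value)
    {
        // whatever server-side work needs doing happens here
        return "got: " + value;
    }
}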
I found out how to make a textbox show a finger pointer on hover. That was cool. I also figured out what needs to be done to upload a project to a server. Currently my project is at www.tobymgraves.com and the only problem occurring is due to the server not saving my pictures immediately. I am constantly working to improve the page. I've tested it in IE, Firefox, and Chrome, plus Safari and Camino on a Mac, and Safari on the iPhone.
Now time to look for a job.
Friday, June 12, 2009
Success
I've succeeded in getting the username from a cookie and automatically accessing the DB to find the user's favorite drawing of a cellular automaton. I can also draw the picture without a postback--it happens right on the page. Also, I have the code assign a new user number if the person has not been to the site, or if they haven't been there in a long time. Phew. Took a lot of work.
Right now I am writing the code for the half of the page that evolves digital circuits from scratch until they fulfill the truth table. Found out it's not super easy to convert a 1 to a boolean without doing a few steps in between. That's about ready to run.
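The in-between steps look something like this (a sketch; Convert.ToBoolean on the string "1" throws, so it has to go through an int first):
using System;

string bit = "1";                          // e.g. a cell pulled from the truth table
int asInt = int.Parse(bit);                // step 1: string -> int
bool asBool = Convert.ToBoolean(asInt);    // step 2: int -> bool (nonzero means true)
Console.WriteLine(asBool);                 // True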
Monday, May 25, 2009
Integration
I have succeeded in triggering an event on a page written in C#, then passing parameters to a JavaScript/AJAX function, passing the params to a C# function, back to a JavaScript function, and ending the program. Now I intend to do my cellular automata/evolutionary algorithm page in AJAX, getting data from the user via cookies, retrieving the data in the background processes, and drawing/retrieving a picture in the background processes. I need to find out how to draw a pic without opening a new window. I may perhaps use the UpdatePanel control.
Tuesday, May 5, 2009
AJAX
Since I have a lot of time on my hands right now, I'm getting back into studying code. Like multiple hours daily if possible. Right now I'm implementing AJAX to make a page that deals with AI concepts. Half the page is cellular automata, the other half is evolutionary algorithms. The idea is to calculate and retrieve data behind the scenes. I found a way to do it, but I'd rather come up with my own way, which does not employ ScriptManager.
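My own way would probably look something like this on the server side--a plain HTTP handler the JavaScript can hit directly with XMLHttpRequest, no ScriptManager involved (the class and parameter names are made up):
using System.Web;

public class AutomataHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // hypothetical query string parameter carrying the CA rule number
        string rule = context.Request.QueryString["rule"] ?? "30";

        // the real page would compute the automaton here and write the result back
        context.Response.ContentType = "text/plain";
        context.Response.Write("requested rule " + rule);
    }

    public bool IsReusable
    {
        get { return true; }
    }
}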
Sunday, January 11, 2009
Cloud pt II
So I changed the use of labels in my tag cloud to the use of hyperlinks. I decided to make it work by using a query string which contains the text that you click on. As a result, when the page is reloaded, the term you clicked on is used as a search term. Right now I have to deal with the postback/page-load logic to make sure it doesn't try to search for a null term. The hyperlinks are left-justified now, but I will figure out a way for them to fill up some container in a nice, orderly fashion.
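The guard I have in mind is roughly this (the query string key and the RunSearch helper are made up):
using System;

public partial class CloudPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            string term = Request.QueryString["term"];

            // only search when a real term actually came in on the query string
            if (!string.IsNullOrEmpty(term))
            {
                RunSearch(term);
            }
        }
    }

    private void RunSearch(string term)
    {
        // hypothetical helper that does the actual lookup
    }
}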
Saturday, December 20, 2008
Cloud
Tonight I'm working on a tag cloud based upon the number of DB entries for terms. I use row variables, add them up for each row, and sort in ascending order. Then I display the top ten terms inside a panel, with label.Font.Size = value for each one's row. So the size is relative to the number of times a term was searched. In the future I will have to come up with a scaling factor, because I can't have 567-point text.
I actually had to use a decrement in the for loop because I was "rolling down the stack" to make room for the newest highest-searched entries.
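The scaling factor would be something along these lines--map a term's count onto a sane font-size range instead of using the raw count (all the numbers here are made up):
using System.Web.UI.WebControls;

Label termLabel = new Label();

int minSize = 10, maxSize = 36;    // font sizes in points
int minCount = 1, maxCount = 567;  // smallest/largest counts among the top ten
int count = 567;                   // how many times this particular term was searched

double scale = (double)(count - minCount) / (maxCount - minCount);
termLabel.Font.Size = minSize + (int)(scale * (maxSize - minSize));   // 36 points, not 567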
Friday, December 5, 2008
Last Night
Last night I was working on the parse portion of the search engine. I'm getting rid of words like "and", "or", and "the". I don't need them, yet, for searches. I was also getting bad results because I was not looking at URLs when there was no second term in the search query. Basically, I forgot to use "" instead of null, so things got messed up.
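The stop-word filtering amounts to something like this (the word list and query here are just examples):
using System;
using System.Linq;

string query = "cats and dogs or the birds";
string[] stopWords = { "and", "or", "the" };

string[] terms = query
    .Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries)
    .Where(w => !stopWords.Contains(w, StringComparer.OrdinalIgnoreCase))
    .ToArray();                                   // leaves { "cats", "dogs", "birds" }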
Wednesday, November 26, 2008
Favicon is Evil
I am working on the spider again tonight. I learned a few things. Like that HttpWebRequest.Timeout exists. And that favicon.ico is a pain in the butt--it always comes up and stops the spider. I've been using Regex to parse strings that have chars I just don't want to be in URLs.
I was reading about other people making web spiders, and they are just out grabbing links, while I am recursively searching links to other links. As such, I am verifying that these pages get responses; I don't travel to them if I can't get one. Although I use a page that provides random links as a starting point, I'm following them as far as I can go.
some stuff:
wreq.Timeout = 60000;              // the HttpWebRequest gives up after 60 seconds
Regex r = new Regex("favicon");
if (r.IsMatch(url))                // url being the link the spider is about to follow
etc...
Tuesday, November 18, 2008
Pretty much
Well, the spider is running pretty much by itself. Used some conditional breakpoints for debugging today. Was dealing with stringURL getting too long to be stored in a database record; basically, I am truncating it. I have rearranged the code so that I don't click a button to find each new URL--instead it runs in a loop. I actually went out to dinner, came back, and it was still finding URLs.
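The truncation itself is trivial (the 255-character limit here is just an assumed column size):
string stringURL = "http://example.com/some/really/long/querystring-laden/path";
const int MaxUrlLength = 255;                       // assumed size of the DB column

if (stringURL.Length > MaxUrlLength)
{
    stringURL = stringURL.Substring(0, MaxUrlLength);
}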
Thursday, November 6, 2008
Searchin'
I'm gonna move back to the search engine part of my project, but first...I had an idea for collecting pages to mine. If I increment through IP addresses as URLs and look at the response each one gives, I can categorize them and store them in a DB. This gives a big basis for looking around without running into a wall.
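Something like this, roughly (the address range and timeout are arbitrary):
using System;
using System.Net;

for (int last = 1; last <= 254; last++)
{
    string url = "http://203.0.113." + last + "/";   // hypothetical range to walk
    try
    {
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
        req.Timeout = 5000;                          // don't hang on dead addresses
        using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
        {
            Console.WriteLine(url + " -> " + resp.StatusCode);   // categorize and store here
        }
    }
    catch (WebException)
    {
        // no response; skip this address
    }
}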
So I changed the part where:
if(affinity>=highest)
to:
if (affinity >= highest)
{
    highest = affinity;
    affinityTwin = whatever;
}
because I made the rookie mistake of not updating highest to the current value when trying to find the highest of all values in a list.
So I've got the AI components going on. It's a little bit Fuzzy Logic and a little bit Neural Net.
More on this later.
Friday, October 31, 2008
Yah, spider
Ok, I've got my spider spidering around the internet, grabbing the text, the title, and the links from web pages. These get stored as database records which are searchable by my search engine. Right now I'm getting exceptions when the spider grabs too much text for the DB to handle (and I know how to fix this: truncation) and when the spider runs out of pages to look at (I know how to fix this--secret).
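Grabbing the title out of a downloaded page boils down to something like this (the URL is just an example):
using System;
using System.Net;
using System.Text.RegularExpressions;

using (WebClient client = new WebClient())
{
    string html = client.DownloadString("http://example.com/");   // the page the spider is visiting

    Match m = Regex.Match(html, @"<title>(.*?)</title>",
                          RegexOptions.IgnoreCase | RegexOptions.Singleline);
    string pageTitle = m.Success ? m.Groups[1].Value : "";

    Console.WriteLine(pageTitle);   // this is what goes into the database record
}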
And if you want to taste something good...put canned salmon on a cracker that has havarti spread on it. This is heavenly.
Tuesday, October 28, 2008
eager
I'm eager to get back to programming. I'm gonna work on the web spider for a while, perfect it. I took the weekend off and just relaxed...well, I did study a bit. I want to work on the recursive algorithm (only recursive in an implicit sense), and make sure the spider can go off on its own to keep looking for pages...so I can fill that database.