Discussion in 'Distributed Computing' started by Gilthanis, Dec 22, 2018.
nice start Tornlogic
As RFGuy stated, just make sure to join our team and you are good. All BOINC projects are [H]ard|OCP except WCG which is HardOCP. If you have not made an account at WCG yet, this link will associate you with the team automatically when joining https://join.worldcommunitygrid.org?recruiterId=338542&teamId=BP5XNJBR9N1
To be fair, most of those credits went to my 2 previous teams, Ultimate Chaos and U.S. Army. I've been a member of Ultimate Chaos since 2000 for other distributed computing projects. I was nearly the last member there doing anything, so it wasn't much of a team. I left that to join U.S. Army, thinking it would be a fun community, but it's dead too. So now I'm here, and I'm here to stay... as long as you guys don't go anywhere. You all seem like a fun bunch.
Also, my heart is in PrimeGrid. I won't be doing anything but that project.
Good to have you anyways. We hit a lot of projects through the year. Last few years we have focused a large bit on Formula-Boinc. This year we are in League 1 and will have some heavy competition. We also heavily hit the Pentathlon in May. The rest we just fit in between those challenges as we can. November and December is mostly dedicated to WCG. So, don't get discouraged if you are the only one hitting PG serious on some of the challenges. We just have too many projects to hit and PG has a ton of challenges.
Okay Gilthanis, that is good to know. Thank you for explaining that, as I had no idea of the intricacies of this team. Very glad to hear it's well developed.
The team is still in 17th place.
Rank | Name | Team | Score | Tasks
71 Tornlogic [H]ard|OCP 1 545 731.08 28
86 RFGuy_KCCO [H]ard|OCP 1 331 704.26 24
114 Coleslaw [H]ard|OCP 925 638.57 17
116 ChristianVirtual [H]ard|OCP 889 024.94 16
305 pututu [H]ard|OCP 105 440.15 2
We are soooo close to 16th place.. Hope we can pass Noobs Of Kryta [NOOB] before the end of the challenge!!
I have ~20 more WUs that should finish before the end of the challenge, so we should pass them, I think.
We moved up to 16th. Only 3 work units from 15th
Nice job guys! Great push at the end, thanks especially to RF. You slammed them out at the end!!
Thanks! Now that I know I can run PG LLR work on my TR rigs without CPU overheating issues, I will probably be an even stronger presence in the PG challenges going forward. That means the team should do even better next time.
Funny you mention that. I had the same problem with my one and only 9900k overheating up until this weekend. Turned out to be a bad H115i. I went out and got a new NZXT Kraken from Microcenter, and immediately went from thermal throttling at 3.6GHz to 55C at 4.7GHz in Prime95 small FFTs. So the whole freaking time I thought I dorked the 7700k when I delidded it. Turned out it was a bad cooler. Funny too, because my temps went crazy about the time I delidded, so I didn't even think it was my cooler. I must have knocked some corrosion loose inside the loop when I was in there messing around with it.
The team finished in 15th place
Rank | Name | Team | Score | Tasks
60 RFGuy_KCCO [H]ard|OCP 2 629 412.38 47
79 Tornlogic [H]ard|OCP 1 831 192.17 33
108 Coleslaw [H]ard|OCP 1 300 861.98 24
142 ChristianVirtual [H]ard|OCP 940 733.84 17
334 pututu [H]ard|OCP 105 440.15 2
Did I ever mention I hate primes? ...but glad we made it so far up.
Well... February only has the one challenge on the books. So, a whole month of finding primes. The good news is that you can win special badges if you are really lucky...lol.
Looks like you're knocking out some serious LLR tasks for the Tour de Primes. Did you know that you have been the double checker for 4 primes?!?! That makes you 3rd place for 1st loser! WTF man?! We need to get you finding primes and being 1st on some of those winning WUs!!
I'm sure you know this, but I just need to verify, as that's what's in my blood. Don't cache WUs. If they sit on your system not being crunched, that gives others a chance to be 1st for that work unit. There are tips on how to keep only one WU in your task cache for each CPU/GPU you are running.
Also make sure you have the <report_results_immediately/> in your app_config.xml file (see below). That keeps completed WUs from sitting around on your system after they finish, which can let someone else get 1st on that WU.
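For anyone who hasn't set this up yet, here is a minimal app_config.xml sketch showing just that flag; the file location is the standard one for BOINC project configs:

```xml
<!-- app_config.xml, placed in the PrimeGrid project directory
     (e.g. projects/www.primegrid.com/ under the BOINC data directory).
     Reload it via Options > Read config files, or restart the client. -->
<app_config>
    <!-- report each finished result to the server as soon as it uploads,
         instead of waiting for the next scheduler contact -->
    <report_results_immediately/>
</app_config>
```

Pair this with a small work cache (e.g. 0.1 days in the computing preferences) so tasks don't sit unstarted in your queue.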
I'm not trying to insult your intelligence if you already know this. I'd rather be safe than sorry as I just hate to see a team member be wingman on 4 freaking primes in such a short amount of time if it can be prevented with simple client tuning.
And hopefully some of this information might help someone else.
By the way. I know I'm not going to get any primes during this event, so I'm living vicariously through you!! Do me proud!!
For Reference: List of Double Checkers for Tour de Primes 2019-
Thanks for the advice, although I was already doing pretty much everything you suggested. I did have a larger cache than I should have, so I have now lowered it to just .1 day on all of my machines. I may look into going all the way down to only one WU in cache at a time, but not sure I'm quite that dedicated to prime finding. That said, it definitely would have been nice to be first on those four primes I double-checked.
Anyway, given the amount of resources I am now throwing at this competition, it is just a matter of time before I find some primes first. I think the smaller caches will help, so thanks for the tip!
Congrats RFGuy! You got your first 2019 Tour de Primes Prime, a GFN-16!! I feel like a winner already... living vicariously through you.
Now I’ve gotten greedy and have switched over to the Genefer 17 Mega tasks on my GPUs, in order to try to find a Mega-Prime.
That's exactly the tactic I would employ. Problem is I'm not finding anything
Me, either. Not yet, anyway.
Congrats, Tornlogic, for finding your first TdP Prime today! A little bigger than the one I found last week, too! Nice!
These primes are hiding pretty good this year, tough to find.
Only shot at a jersey now for me is going big or staying home. Just a little over a week to go. Good luck!
The Chinese Year of the Pig has started, good luck. GCW tasks for this one.
In the 3rd PG challenge (TRP LLR), we are in comfortable 5th spot with 3 more full days to go.
Rank | Name | Score | Tasks
4 Sicituradastra. 12 554 565.39 2 756
5 [H]ard|OCP 9 423 947.89 2 079
6 Crunching@EVGA 6 063 054.98 1 324
Rank | Name | Score | Tasks
4 EXT64 4 254 683.95 940
7 RFGuy_KCCO 3 199 523.37 705
26 ChelseaOilman 1 290 820.53 286
73 pututu 447 406.43 97
85 Tornlogic 366 677.07 81
May stop once I get my gold TRP badge, which I hadn't obtained prior to this challenge. P.S. I'm only running one Xeon rig here as the CPU gets pretty hot.
I just don't look at temps, lol. Really though my stock clock Xeons don't get too hot. The Phi running AVX512 did get the hottest I've seen it in the upper 50's C.
Threw a few systems on for shits and giggles. Probably way too little, way too late, but what the heck.
It will love your newer systems, thanks!
This just arrived today; but BOINC won't max out all the cores and I don't have the time to fight with it anymore. (Thought I could just install a second instance of BOINC... but guess not?)
Not sure why some BOINC projects have issues with high core count systems and others don't. Frustrating.
Might be easier to run projects with multi-threaded work units (like PG challenge!, lol). Things definitely go wonky with the Phi if I try to push it to maximum thread counts (68c x 4 threads), though some of that is just the Phi being a Phi.
This is with the multi threaded setup. Won't go above 47% or so. On some of my more... tame... 4Ps I tricked BOINC until the box was running around 100% usage. However, on these 100+ "core" systems I keep running into this issue. If the challenge had just started I'd simply run two VMs to each consume ~50% of the resources each.
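For what it's worth, the usual way to "trick" the client on big boxes is to override the detected CPU count in cc_config.xml; this is only a sketch of that general trick, not necessarily what was done here, and the count is an example value:

```xml
<!-- cc_config.xml, placed in the BOINC data directory.
     <ncpus> overrides the number of processors BOINC thinks it has;
     128 here is an example for a high core-count 4P box. -->
<cc_config>
    <options>
        <ncpus>128</ncpus>
    </options>
</cc_config>
```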
The team has done an excellent job in this challenge. Looks like we should finish a strong 5th place, which is a tremendous showing for us in a PG challenge. Thanks to all who crunched, especially EXT64, who really killed it!
Individual Standings (6 hours to go until the end of the challenge):
Rank | Name | Team | Score | Tasks
4 EXT64 [H]ard|OCP 7 003 806.36 1 529
6 RFGuy_KCCO [H]ard|OCP 5 871 203.16 1 275
28 ChelseaOilman [H]ard|OCP 2 218 624.50 486
97 Tornlogic [H]ard|OCP 607 747.41 133
100 fastgeek [H]ard|OCP 570 157.95 123
102 pututu [H]ard|OCP 547 682.64 118
The next two challenges in this series are also CPU-based, just in case anyone is planning ahead.
4 15-20 July 20:17:00 PPS-LLR 50th Anniversary of the Moon Landing Challenge 5 days Individuals | Teams
5 3-10 August 00:00:00 ESP-LLR Lennart Vogel Honorary Challenge 7 days Individuals | Teams
Yep, LLR too. Planning to work on optimizing (HT and multithreading) between now and then.
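For anyone else tuning multithreaded LLR ahead of those challenges, the usual knob is a per-app entry in app_config.xml. A sketch, with the caveat that the app name llrESP and the 4-thread count are assumptions; check the <app_name> entries in client_state.xml for the exact names on your system:

```xml
<!-- app_config.xml in the PrimeGrid project directory.
     app_name "llrESP" is a guess for the ESP-LLR application;
     verify it against client_state.xml before using. -->
<app_config>
    <app_version>
        <app_name>llrESP</app_name>
        <!-- run each task across 4 threads instead of
             running 4 separate single-threaded tasks -->
        <cmdline>-t 4</cmdline>
        <avg_ncpus>4</avg_ncpus>
    </app_version>
</app_config>
```

Fewer, wider tasks tend to stay in cache better on big CPUs and finish each WU sooner, which also helps with being first on a prime.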
If anyone is interested, the PG 2019 overall team and individual stats are available here (link can also be found in the PG main challenge series page), including historical stats.
Our best finish was in 2016 at 11th place. My best finish was in 2016 at 77th place.
2019 Team stat. [H] is currently at 13th spot.
2019 Individual stat.
In this year's series and the past two years, the top three spots have been shared between CNT, SG and AtP.
Perhaps the team should decide if they would like to focus on PG series for 2020.... I know many members are not interested in primes but it would be a change up to the constant stress of bunkering/team hopping drama all the other challenges bring.
This PG challenge is pure CPU/GPU power play. No other special technique is required other than starting the race when the gun goes off. No need to worry about wingman or what not.
Good challenge, but like you said, not many members are diehard PG followers. Well, maybe except tornlogic.
Yeah...just figured with all the drama that went down recently and the PTSD of the Pentathlon...lol People might be more interested in a cool down of sorts. It is still early to be discussing next year's plans but wanted people to keep it in mind.
I'm all for it!
I'm certainly planning to continue having fun with it this year. 2020 is a long ways away, but certainly something I'd consider.