Star Wars Jedi: Survivor

you don't need to buy the game to realize that it has issues...that's the entire point of reviews!...reviewers tell people about the gameplay and performance!...I'm not saying you need to trust every review site in the world but for me personally I have a few sites that I trust...Digital Foundry is one of them and is the best at technical breakdowns and analysis...I'm not sure why you're trying to tell everyone that the game has little to no performance issues when it's obvious it does

this was one of my most anticipated titles of 2023 so I'm disappointed...I already have a copy of the game but I'm going to wait for a few more patches...too many people have some kind of obsession with playing games on Day 1 no matter what condition the game is in...I have plenty of other games in my backlog along with other interests besides gaming to keep me occupied

again too many people think that just because you can technically play the game that it somehow makes the game great...even the worst performing games are technically playable on Day 1 (Cyberpunk 2077, Arkham Knight PC etc) if you're willing to compromise or ignore the major issues...I can lower the graphics settings to Low quality or lower the resolution to 1080p and call it 'playable' but that is not the ideal gameplay experience...I want my first playthrough to be the best it can be and if that means waiting a few weeks then I'm fine with that
Jesus dude, you have an RTX 3080. Really good chance the game is very playable for you.

If you demand 120 FPS at 4K, I'm guessing it's never gonna happen. I suspect your 1440p experience would be fine (depending on what you call fine).

People on here with worse systems have been playing the game and enjoying it. There is a guy on here that is running it on a 2080Ti with a lower end system and getting 40+ FPS at medium or better settings.

You can wait, but the optimization will be a long road and it's never gonna end up being what you're hoping for if you're holding off with the rig you have that is more than capable of playing the game well.

You should be able to dial the settings in to get 60+ FPS in 1440P, easily. It's not a competitive shooter so you don't need 144 FPS... Never gonna see that either, no matter how many times they patch the game.

I averaged 56+ FPS at 4K, all settings maxed + RT. Not gonna get much better than that. I played it until I got burned out on the jump puzzles and I'm still firing it up to screw around with it here and there.

You can wait, but there will be a bunch of other stuff out by then. Like Diablo 4 (which was awesome in beta and makes this game look like clownshoes overall).
 
Exactly. There were a bunch of people who did not have many issues with CP2077 on launch. But obviously that game had lots of problems on release and to claim it didn't would just be completely dishonest or willfully ignorant.
Again, I had no problems running Cyberpunk when it came out with everything on High + RT on my 2080Ti at the time. I suspect, if you want to play bleeding-edge games at acceptable FPS when they come out, you have to have the right hardware. Maybe I'm the outlier, but the 2080Ti wasn't even the best card out there when Cyberpunk finally launched. Pretty sure the recommendation was the 3000-series Nvidia cards.

Even my cousin was playing the game on a 2060 Super and getting very playable FPS at 1080p on an i5-8600K... So, I really have to take all these "ABYSMAL PERFORMANCE!!!!!" claims with a shitload of salt.

Guess it depends on what you consider to be playable.
 
the map was horrible in Fallen Order (amazing game but terrible map)...I heard they improved it in Jedi: Survivor
FYI - when you do get in the game the menus are dogshit. You need to set em, save em, quit and restart the game. Then they will take effect.

The map was... MEH. I found myself struggling with it just like in the first one. I had a lot of issues pinpointing my location vs. where I had to go. It's not all that intuitive. And I don't even know why it's in the game. I almost never used it due to the lack of clarity.
 
the map was horrible in Fallen Order (amazing game but terrible map)...I heard they improved it in Jedi: Survivor
I can confirm it's a little better, but having to use WASD to move around is fucking ridiculous. It takes a concerted effort to actually place the cursor on a point of interest and do anything with it. Also, the map does a very poor job of handling verticality and showing how areas connect to one another. The map is fine until you need to transition to another area. Like in the first game, it's completely unhelpful in determining how to do that.
 
I can confirm it's a little better, but having to use WASD to move around is fucking ridiculous. It takes a concerted effort to actually place the cursor on a point of interest and do anything with it. Also, the map does a very poor job of handling verticality and showing how areas connect to one another. The map is fine until you need to transition to another area. Like in the first game, it's completely unhelpful in determining how to do that.
That's exactly what I saw when using it. I almost never referred to it as a result.
 
Again, I had no problems running Cyberpunk when it came out with everything on High + RT on my 2080Ti at the time. I suspect, if you want to play bleeding-edge games at acceptable FPS when they come out, you have to have the right hardware. Maybe I'm the outlier, but the 2080Ti wasn't even the best card out there when Cyberpunk finally launched. Pretty sure the recommendation was the 3000-series Nvidia cards.

Even my cousin was playing the game on a 2060 Super and getting very playable FPS at 1080p on an i5-8600K... So, I really have to take all these "ABYSMAL PERFORMANCE!!!!!" claims with a shitload of salt.

Guess it depends on what you consider to be playable.
I had the RTX 2080 Ti when the game launched and I can only conclude you were not running the game at 4K as the RTX 2080 Ti was NOT capable of delivering a great experience in that game with ray tracing at that resolution.
 
I had the RTX 2080 Ti when the game launched and I can only conclude you were not running the game at 4K as the RTX 2080 Ti was NOT capable of delivering a great experience in that game with ray tracing at that resolution.
I ran at 4K, not max settings, and I dropped RT after I realized it wasn't delivering much and it was killing my FPS. I was averaging 40+ FPS, same as always on most titles at 4K.
 
Honestly, Dan_D, I wasn't really able to play 4K titles at max settings until I got the 7900XTX. My 6900XT was an OK uplift over the 2080Ti, but I never really cared for it. I ended up giving it to a buddy who had nothing; it should last him years.

Since I have been running games on a 4K TV locked to 60 Hz, there has been little reason for me to worry about FPS past that. I'm looking at getting a higher-refresh 4K display coming up here, but I only paid 200 bucks for the TV I'm using and it looks pretty damn good overall. So, not in a rush for a 120 Hz version.
 
I ran at 4K, not max settings, and I dropped RT after I realized it wasn't delivering much and it was killing my FPS.
Well, you said you ran it on High with ray tracing. I ran everything on Ultra, as I won't turn down graphics settings unless I have no choice. It was playable, although barely so. Turning down visuals or dealing with sub-60 FPS performance doesn't qualify as a great experience in my mind. As for ray tracing, I'm not sure how you can say that, since the difference is significant. People were absolutely right in saying that the game ran like crap, though at least it delivered amazing visuals that mostly justify the performance. Obviously the game runs better now and we have better hardware available today than we had when the game launched.

I think the rub here is that while Jedi Survivor does look great, it doesn't look good enough to justify the performance problems. While I think some people expect too much thinking that there is always some secret optimization that will allow a game to look great while running on a potato, the fact is that it could run better than it does while delivering the same visual fidelity. It's obvious the developers did a shit job porting the game to PC.
 

yeah, here's the rig he was running it on:
CPU - Intel 13700KF with MSI MAG Core Liquid 360R V2 ♦ GPU - GIGABYTE GeForce RTX 4090 GAMING OC ♦ RAM - CORSAIR Vengeance 32GB CL36 6400 MT/s ♦ Mobo - ASUS Prime Z690-P DDR5 (BIOS 2212) ♦ NVMe - ADATA XPG GAMMIX S70 Blade (Read: 7400 MB/s; Write: 5500 MB/s) ♦ Display - Samsung 85-inch QN85B Neo QLED 4K UHD HDR 24X Dolby Atmos Gaming Smart TV ♦ PSU - EVGA GQ 1000W ♦ Case - Phanteks Eclipse 500A
 
I'm not sure what you mean?
This is what I mean. Sure, more GPUs were getting better results than I listed. What I was saying is that most people using a 4090, 7900XTX, etc. are not playing at 1080p medium, which is where the patch gains the most performance.

(benchmark screenshots attached)
 
Honestly, Dan_D, I wasn't really able to play 4K titles at max settings until I got the 7900XTX. My 6900XT was an OK uplift over the 2080Ti, but I never really cared for it. I ended up giving it to a buddy who had nothing; it should last him years.

Since I have been running games on a 4K TV locked to 60 Hz, there has been little reason for me to worry about FPS past that. I'm looking at getting a higher-refresh 4K display coming up here, but I only paid 200 bucks for the TV I'm using and it looks pretty damn good overall. So, not in a rush for a 120 Hz version.
I could max out most games at the time with my RTX 2080 Ti until Cyberpunk 2077 came out. Sure, there was room for improvement on some of the games I was playing but I got a good experience overall. However, I would be in a rush to get a 120Hz capable display if I were you. Trust me, you can get more than 60FPS in a lot of games and the difference is night and day.
 
Well, you said you ran it on High with ray tracing. I ran everything on Ultra, as I won't turn down graphics settings unless I have no choice. It was playable, although barely so. Turning down visuals or dealing with sub-60 FPS performance doesn't qualify as a great experience in my mind. As for ray tracing, I'm not sure how you can say that, since the difference is significant. People were absolutely right in saying that the game ran like crap, though at least it delivered amazing visuals that mostly justify the performance. Obviously the game runs better now and we have better hardware available today than we had when the game launched.

I think the rub here is that while Jedi Survivor does look great, it doesn't look good enough to justify the performance problems. While I think some people expect too much thinking that there is always some secret optimization that will allow a game to look great while running on a potato, the fact is that it could run better than it does while delivering the same visual fidelity. It's obvious the developers did a shit job porting the game to PC.
No arguments on the port.

The console version is too much for the consoles to handle. I kind of prefer this because it means that a badass PC should be able to make the game shine. I despise that nearly everything is made for a console first and ported to PC later.

Even after the (latest) patch the game is maxing out at 60 FPS with everything turned on (on the 7900XTX) so I got about the best experience with the game out the gate.

Nobody makes bug free games anymore. I don't recall the last time I ran into a bug free piece of software, game or application.
I could max out most games at the time with my RTX 2080 Ti until Cyberpunk 2077 came out. Sure, there was room for improvement on some of the games I was playing but I got a good experience overall. However, I would be in a rush to get a 120Hz capable display if I were you. Trust me, you can get more than 60FPS in a lot of games and the difference is night and day.
I will take that under advisement and see what I can do. Is that Aorus 48" 4K display any good? Or is there a better option?
 
Running just fine on high everything for me at 4K. Didn't check framerate but it's been smooth. I turned off motion blur and the other camera blur or whatever, and didn't bother with ray tracing because it's not good on UE4 anyway.

Using an i9-13900K at 5.7 GHz, DDR5-6000 CL30, RTX 3090 for 4K. NVMe boot/game drives.

Worked great on my other system at 1440p: i7-8700K, DDR4-3200, RTX 2070, NVMe drives.
 
yeah, here's the rig he was running it on:
CPU - Intel 13700KF with MSI MAG Core Liquid 360R V2 ♦ GPU - GIGABYTE GeForce RTX 4090 GAMING OC ♦ RAM - CORSAIR Vengeance 32GB CL36 6400 MT/s ♦ Mobo - ASUS Prime Z690-P DDR5 (BIOS 2212) ♦ NVMe - ADATA XPG GAMMIX S70 Blade (Read: 7400 MB/s; Write: 5500 MB/s) ♦ Display - Samsung 85-inch QN85B Neo QLED 4K UHD HDR 24X Dolby Atmos Gaming Smart TV ♦ PSU - EVGA GQ 1000W ♦ Case - Phanteks Eclipse 500A
You saying the rig is bad? I could buy a better one, but it still will not fix the stuttering mess. I like the game, played like 17 hours, just would like not to be bottlenecked at 4K max settings.
 
You saying the rig is bad? I could buy a better one, but it still will not fix the stuttering mess. I like the game, played like 17 hours, just would like not to be bottlenecked at 4K max settings.
No, dude. I was simply posting the rig specs so people knew how he (you?) was getting 4K gameplay.
 
This is what I mean. Sure more GPU's were getting better results then I listed. Most people using a 4090, 7900XTX etc. are not playing at 1080p med where all the difference from the patch gains the most performance is what I was saying.

(benchmark screenshots attached)
Ah ha ;)

I guess it makes sense. The non-apology letter from the development team seemed to imply that they would work on making the game more forgiving with CPU performance. And it seems that is what they did.
 
For what it's worth, I have an AMD CPU (7950X3D) and an Nvidia card (RTX 4090) and the game stutters like a mofo for me. That's in 4K with things set to High, but no RT. The Digital Foundry video that showed the various stutters last week mirrors what I see right now, even after the patches. I haven't tinkered around with all the settings, but it didn't seem like FPS was the problem. My framerate was through the roof, the game just kept hitching.
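Not something from the thread, just a hedged way to put numbers on that kind of hitching: a minimal Python sketch that reads a PresentMon-style frame-time log and reports average FPS, an approximate 1% low, and how many frames spiked far past the median. The MsBetweenPresents column name is an assumption based on PresentMon's CSV output; adjust it for whatever capture tool you use.

import csv
import statistics
import sys

def hitch_report(path, hitch_factor=2.5):
    # Collect per-frame times in milliseconds from the capture CSV.
    frametimes = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                frametimes.append(float(row["MsBetweenPresents"]))  # assumed column name
            except (KeyError, ValueError):
                continue
    if not frametimes:
        return None
    median = statistics.median(frametimes)
    ordered = sorted(frametimes)
    return {
        "frames": len(frametimes),
        "avg_fps": 1000.0 / statistics.fmean(frametimes),
        "approx_1pct_low_fps": 1000.0 / ordered[int(len(ordered) * 0.99)],
        "hitch_frames": sum(ft > hitch_factor * median for ft in frametimes),
        "worst_frame_ms": max(frametimes),
    }

if __name__ == "__main__":
    print(hitch_report(sys.argv[1]))

A log with a high average FPS but a non-trivial hitch_frames count matches the "framerate through the roof, but it keeps hitching" description.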
 
For what it's worth, I have an AMD CPU (7950X3D) and an Nvidia card (RTX 4090) and the game stutters like a mofo for me. That's in 4K with things set to High, but no RT. The Digital Foundry video that showed the various stutters last week mirrors what I see right now, even after the patches. I haven't tinkered around with all the settings, but it didn't seem like FPS was the problem. My framerate was through the roof, the game just kept hitching.
For shits and giggles, have you tried disabling the CCD that doesn't have the V-Cache on it? Curious if it would fix the issue.
 
For shits and giggles, have you tried disabling the CCD that doesn't have the V-Cache on it? Curious if it would fix the issue.

Haven't yet. The fact that the DF video had the same hitches in the same exact places (with another setup entirely) made me kinda wonder if there was any point. If the next couple update patches don't change anything I'll probably start trying whatever I can.
 
Haven't yet. The fact that the DF video had the same hitches in the same exact places (with another setup entirely) made me kinda wonder if there was any point. If the next couple update patches don't change anything I'll probably start trying whatever I can.
Do not take one reviewer's experience as evidence of your issue. If I recall, they used an Intel system (which has been shown to have issues with E-cores). It's possible that the game is having communication issues between the two CCDs. Disable one CCD to see if the issue is fixed.
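For anyone who would rather not disable a CCD in the BIOS, a software-level version of the same idea is to pin the game process to the V-Cache CCD's logical CPUs. This is only a sketch under assumptions: it assumes a 7950X3D where CCD0 (logical CPUs 0-15 with SMT) is the V-Cache die, the process name is a guess, and it needs psutil plus an elevated prompt.

import psutil

GAME_EXE = "JediSurvivor.exe"   # assumed process name; check Task Manager for the real one
VCACHE_CPUS = list(range(16))   # assumption: CCD0 (logical CPUs 0-15) carries the 3D V-Cache

def pin_game_to_vcache_ccd():
    pinned = []
    for proc in psutil.process_iter(["name"]):
        name = proc.info["name"]
        if name and name.lower() == GAME_EXE.lower():
            # Restrict the scheduler to the V-Cache CCD for this process only;
            # the other CCD stays enabled for everything else.
            proc.cpu_affinity(VCACHE_CPUS)
            pinned.append(proc.pid)
    return pinned

if __name__ == "__main__":
    print("Pinned PIDs:", pin_game_to_vcache_ccd())

Nothing here survives a reboot or a game restart, so it's easy to undo; tools like Process Lasso do the same thing with a UI.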
 
There seems to be a disconnect in understanding between "yeah, the game is a shit port and has a list of issues, but here are some ways to at least make things better until (hopefully) real fixes come out" and "all reviewers say this game cannot run well at all, and there can't be any way to at least mitigate it because the reviewers didn't find (try) any". Bonus points for insinuating that AMD paid for the game to run like this.
 
Welcome to modern gaming. Where Intel sponsors Total War Warhammer 3's latest patch and that game still has reported issues with e-cores a year after release.
Wow, really, that's still an issue? lol
 
if the end user needs to disable E-cores or disable CCDs to get the game working properly, then something is wrong with the game and it needs to be fixed...
And it would make it the only game I know of that has a performance problem such as stuttering with those E-cores. There are only a handful of games that even lose any performance, and it's usually just a few percent. But overall, those efficiency cores give the same or better performance and in most cases make the overall experience smoother. That's not even factoring in what they do outside of gaming. So no way in hell am I going to disable them for a single defective game even if that was the fix. And I'm still calling BS on that so-called fix, because again there is report after report after report from very trustworthy review sites that have the same exact stuttering on CPUs that do not even use efficiency cores.

Anyway, I decided to pick the game up and will test it out myself tonight. I noticed it's usually the same people that say this game runs fine who also said they didn't have any stuttering in the first game. Anyone that says the first game didn't stutter for them has absolutely zero credibility, as they are beyond oblivious. I have played the first game on numerous configurations over the years and it stutters in the exact same places every single time, and is easily reproducible. In fact, I used to get into debates on the Steam forums, and not one goddamn person could prove me wrong when I asked them to show me a video of the exact area I gave as an example.
 
And it would make it the only game that has a performance problem such as stuttering with those E-cores. There are only a handful of games that even lose any performance, and it's usually just a few percent. But overall, those efficiency cores give the same or better performance and in most cases make the overall experience smoother. That's not even factoring in what they do outside of gaming. So no way in hell am I going to disable them for a single defective game even if that was the fix. And I'm still calling BS on that so-called fix, because again there is report after report after report from very trustworthy review sites that have the same exact stuttering on CPUs that do not even use efficiency cores.

Anyway, I decided to pick the game up and will test it out myself tonight. I noticed it's usually the same people that say this game runs fine who also said they didn't have any stuttering in the first game. Anyone that says the first game didn't stutter for them has absolutely zero credibility, as they are beyond oblivious. I have played the first game on numerous configurations over the years and it stutters in the exact same places every single time, and is easily reproducible. In fact, I used to get into debates on the Steam forums, and not one goddamn person could prove me wrong when I asked them to show me a video of the exact area I gave as an example.
I believe the e-cores thing worked for someone. There have been numerous major patches/content releases for Apex (a Respawn game) that have caused major crashing/instability errors for players. EVERY TIME you get tons of "fixes" from people that work for a handful of players. But the truth is if you just wait several days they fix the problem in an ensuing patch.
 
I just finished my replay of Fallen Order so I think I'm going to jump into Jedi: Survivor without waiting for further patches as the gameplay is getting excellent reviews...I absolutely loved the first game and I'm familiar again with the combat, gameplay and story so it's best to jump right into the sequel...I know I'm going to re-play Jedi: Survivor soon so hopefully by that time it's in much better shape performance wise
 
And it would make it the only game that has a performance problem such as stuttering with those E-cores. There are only a handful of games that even lose any performance, and it's usually just a few percent. But overall, those efficiency cores give the same or better performance and in most cases make the overall experience smoother. That's not even factoring in what they do outside of gaming. So no way in hell am I going to disable them for a single defective game even if that was the fix. And I'm still calling BS on that so-called fix, because again there is report after report after report from very trustworthy review sites that have the same exact stuttering on CPUs that do not even use efficiency cores.

Anyway, I decided to pick the game up and will test it out myself tonight. I noticed it's usually the same people that say this game runs fine who also said they didn't have any stuttering in the first game. Anyone that says the first game didn't stutter for them has absolutely zero credibility, as they are beyond oblivious. I have played the first game on numerous configurations over the years and it stutters in the exact same places every single time, and is easily reproducible. In fact, I used to get into debates on the Steam forums, and not one goddamn person could prove me wrong when I asked them to show me a video of the exact area I gave as an example.
I have to disable them all the time to run legacy applications that are way the hell out of support. It's not like all software has to support new trends. There is plenty of shit out there that is well past its expiration date and still functional and playable.

You don't have to disable the cores forever. Just curious, are you gonna be doing intense productivity, running 15 other game sessions in the background, while playing Jedi: Survivor? If not, turn em off, play the effing game, and turn em on when you're doing other shit. It's effective at circumventing the issues we are having with the game before all the bugs get ironed out.

It's amazing how many of you on here whine about having to do things that actually work to get the game running and act like it's some fucking conspiracy against you, specifically, that you can't achieve 120 FPS. If you don't want to do that, the rest of us that are playing the game and enjoying it don't want to hear about your drama about how you refuse to run the game until they fix it just for you. Either that, or you're one of those people that haven't bought the game yet and are jumping on the drama wagon.
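In that spirit, here's a hedged sketch of the "turn them off just for this game" idea without rebooting into the BIOS: launch the game and restrict it to the P-cores only, so the E-cores stay available to the rest of the system. The CPU layout (logical CPUs 0-15 as the hyperthreaded P-cores on a 13700K, 16-23 as E-cores) and the exe path are assumptions; verify both for your setup, and note that launcher-wrapped games may spawn a child process that needs the same treatment.

import subprocess
import psutil

GAME_PATH = r"C:\Games\Jedi Survivor\JediSurvivor.exe"  # placeholder path, not the real install location
P_CORE_CPUS = list(range(16))  # assumption: 8 P-cores with hyperthreading enumerate as CPUs 0-15

proc = subprocess.Popen([GAME_PATH])
# Apply the affinity mask right after launch; E-cores stay enabled system-wide.
psutil.Process(proc.pid).cpu_affinity(P_CORE_CPUS)
print(f"Launched PID {proc.pid} restricted to P-cores {P_CORE_CPUS}")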
 
I have to disable them all the time to run legacy applications that are way the hell out of support. It's not like all software has to support new trends. There is plenty of shit out there that is well past its expiration date and still functional and playable.

You don't have to disable the cores forever. Just curious, are you gonna be doing intense productivity, running 15 other game sessions in the background, while playing Jedi: Survivor? If not, turn em off, play the effing game, and turn em on when you're doing other shit. It's effective at circumventing the issues we are having with the game before all the bugs get ironed out.

It's amazing how many of you on here whine about having to do things that actually work to get the game running and act like it's some fucking conspiracy against you, specifically, that you can't achieve 120 FPS. If you don't want to do that, the rest of us that are playing the game and enjoying it don't want to hear about your drama about how you refuse to run the game until they fix it just for you. Either that, or you're one of those people that haven't bought the game yet and are jumping on the drama wagon.
If I have to disable part of my hardware for a game to function properly, then that goddamn game is broken. And later when I get home I will link to a couple of videos that show, overall, it is much better to have those efficiency cores enabled for gaming. And in both of the videos that I will link, there was not a single game that had stuttering issues. And if anything, the efficiency cores made the overall experience in gaming better, never mind overall computing. Hell, those efficiency cores are a tremendous help during shader compilation, especially in The Last of Us.
 
If I have to disable part of my hardware for a game to function properly, then that goddamn game is broken. And later when I get home I will link to a couple of videos that show, overall, it is much better to have those efficiency cores enabled for gaming. And in both of the videos that I will link, there was not a single game that had stuttering issues. And if anything, the efficiency cores made the overall experience in gaming better, never mind overall computing.
Or, the CPU hardware is so new that Windows has problems with scheduling. So should we say Windows 10/11 is broken?

Should we also state that all the games listed below (a lot of which are fixed now) are broken?

Incompatible games (Windows 11)

  • Anthem
  • Bravely Default 2
  • Fishing Sim World
  • Football Manager 2019
  • Football Manager Touch 2019
  • Football Manager 2020
  • Football Manager Touch 2020
  • Legend of Mana
  • Mortal Kombat 11
  • Tony Hawks Pro Skater 1 and 2
  • Warhammer I
  • Assassin’s Creed: Valhalla
  • Far Cry Primal
  • Fernbus Simulator
  • For Honor
  • Lost in Random
  • Madden 22
  • Maneater
  • Need for Speed – Hot Pursuit Remastered
  • Sea of Solitude
  • Star Wars Jedi Fallen Order <-----------------LOOK
  • Tourist Bus Simulator
Incompatible games (Windows 10)

  • All of the above, plus:
  • Ace Combat 7
  • Assassins Creed Odyssey
  • Assassins Creed Origins
  • Code Vein
  • eFootball 2021
  • F1 2019
  • Far Cry New Dawn
  • FIFA 19
  • FIFA 20
  • Football Manager 2021
  • Football Manager Touch 2021
  • Ghost Recon Breakpoint
  • Ghost Recon Wildlands
  • Immortals Fenyx Rising
  • Just Cause 4
  • Life is Strange 2
  • Madden 21
  • Monopoly Plus
  • Need For Speed Heat
  • Scott Pilgrim vs The World
  • Shadow of the Tomb Raider
  • Shinobi Striker
  • Soulcalibur VI
  • Starlink
  • Team Sonic Racing
  • Total War Saga - Three Kingdoms
  • Train Sim World
  • Train Sim World 2
  • Wolfenstein Youngblood
 
If I have to disable part of my hardware for a game to function properly, then that goddamn game is broken. And later when I get home I will link to a couple of videos that show, overall, it is much better to have those efficiency cores enabled for gaming. And in both of the videos that I will link, there was not a single game that had stuttering issues. And if anything, the efficiency cores made the overall experience in gaming better, never mind overall computing. Hell, those efficiency cores are a tremendous help during shader compilation, especially in The Last of Us.
You're thinking in a very linear fashion. You do you.
 
Or, the CPU hardware is so new that Windows has problems with scheduling. So should we say Windows 10/11 is broken?

Should we also state that all the games listed below (a lot of which are fixed now) are broken?

Incompatible games (Windows 11)

  • Anthem
  • Bravely Default 2
  • Fishing Sim World
  • Football Manager 2019
  • Football Manager Touch 2019
  • Football Manager 2020
  • Football Manager Touch 2020
  • Legend of Mana
  • Mortal Kombat 11
  • Tony Hawks Pro Skater 1 and 2
  • Warhammer I
  • Assassin’s Creed: Valhalla
  • Far Cry Primal
  • Fernbus Simulator
  • For Honor
  • Lost in Random
  • Madden 22
  • Maneater
  • Need for Speed – Hot Pursuit Remastered
  • Sea of Solitude
  • Star Wars Jedi Fallen Order <-----------------LOOK
  • Tourist Bus Simulator
Incompatible games (Windows 10)

  • All of the above, plus:
  • Ace Combat 7
  • Assassins Creed Odyssey
  • Assassins Creed Origins
  • Code Vein
  • eFootball 2021
  • F1 2019
  • Far Cry New Dawn
  • FIFA 19
  • FIFA 20
  • Football Manager 2021
  • Football Manager Touch 2021
  • Ghost Recon Breakpoint
  • Ghost Recon Wildlands
  • Immortals Fenyx Rising
  • Just Cause 4
  • Life is Strange 2
  • Madden 21
  • Monopoly Plus
  • Need For Speed Heat
  • Scott Pilgrim vs The World
  • Shadow of the Tomb Raider
  • Shinobi Striker
  • Soulcalibur VI
  • Starlink
  • Team Sonic Racing
  • Total War Saga - Three Kingdoms
  • Train Sim World
  • Train Sim World 2
  • Wolfenstein Youngblood
And how old is that list? Last time I looked, they said every single game is compatible with efficiency cores on Windows 11. In fact, that was one of the things I researched before I decided to go with a CPU like the 13700K instead of an AMD CPU. I've been playing games for months now and there hasn't been one single goddamn game that's had an issue on my end because of the efficiency cores.
 
And how old is that list? Last time I looked, they said every single game is compatible with efficiency cores on Windows 11. I've been playing games for months now and there hasn't been one single goddamn game that's had an issue on my end because of the efficiency cores.
Well, good for you! That doesn't mean there isn't an issue with the E-cores in games. IMO, if you aren't willing to help troubleshoot issues for people so we can make this fantastic game work for them... then why do you keep posting?

You aren't bringing much to this thread other than complaining. A lot of people have tried to help you, yet you seem to just complain over and over about the issue we all know everyone is having.
 