I have a Python script, file_creator.py. Its purpose is to take over 1037 CSV files and concatenate them into one single file. The problem is that each time I run the script, my whole system freezes. How can I prevent my CPU from reaching 100%? I have this problem with many other scripts and I just don't know how to prevent it, even though I have a very good computer.
How can I modify the script so it works fine on my computer? I would also like advice on how to handle this kind of problem in general. How can I monitor the CPU so that it does not go over 100%?
Hello, software engineer here. A question: why are you using Python for that task? Why not use native Windows applications for these jobs (if you are in a Windows environment)? I can evaluate your current scripts and give you a better, more efficient solution. Kind regards
Hello!
I am a Python developer.
I looked at your project and it seems interesting.
I have good experience in Python and I am an expert in it.
I have all the necessary skills required to be a good developer, and I am interested in your project.
Please check my portfolio and reviews.
Ping me to discuss in detail. Waiting for your response.
I suspect that your computer may be running out of RAM, causing it to write memory to the swap file on disk, so your processes back up waiting for disk access.
I've worked with huge files that won't fit into memory on many occasions.
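To check that hypothesis (and to watch CPU load, as you asked), you could run a small monitor alongside the script. This is just a sketch using the third-party psutil package (install it with pip install psutil):

```python
import psutil  # third-party: pip install psutil

# Print CPU, RAM, and swap usage once per second while the
# concatenation script runs in another terminal; stop with Ctrl+C.
while True:
    cpu = psutil.cpu_percent(interval=1)  # blocks for one second
    ram = psutil.virtual_memory().percent
    swap = psutil.swap_memory().percent
    print(f"CPU {cpu:5.1f}%  RAM {ram:5.1f}%  swap {swap:5.1f}%")
```

If swap usage climbs while the script runs, memory exhaustion rather than raw CPU load is the real problem.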
Hi. Generally, when we need to ease up on the CPU, we periodically yield execution to the system using sleep. But in this case the likely culprits are the bulk memory operations executed by read_csv and pd.concat. Since we can't yield from inside them, we would need to rewrite the program to load the files line by line into a preallocated dataframe. However, I can see that you only use Pandas to read and write CSV files, and for that you don't need Pandas at all; it can easily be done with standard Python. If you choose me, please provide sample data and I'll figure out the best approach for you.
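A minimal sketch of that standard-Python approach, assuming every file starts with the same header row (the paths and the sleep interval are illustrative, not from your script):

```python
import glob
import time

# Illustrative pattern; point it at your 1037 CSV files.
csv_files = sorted(glob.glob("data/*.csv"))

with open("combined.csv", "w", encoding="utf-8") as out_file:
    for index, path in enumerate(csv_files):
        with open(path, encoding="utf-8") as in_file:
            header = in_file.readline()
            # Keep the header from the first file only.
            if index == 0:
                out_file.write(header)
            # Copy the rest line by line, so memory use stays flat
            # no matter how large the combined output gets.
            for line in in_file:
                out_file.write(line)
        # Yield to the rest of the system every 50 files so the
        # machine stays responsive; tune both numbers to taste.
        if index % 50 == 49:
            time.sleep(0.1)
```

Because only one line is held in memory at a time, this never triggers the bulk allocations that pd.concat does.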
Hello,
To fix a CPU-intensive application, we need to use a profiler to catch the code block or function that consumes too many CPU cycles. There is another way, too: examining the algorithm itself. Please send me your Python script so I can analyze the algorithm.
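As a sketch, the standard library's cProfile module can do this without any extra tooling (concatenate_all is a hypothetical stand-in for your script's real entry point):

```python
# Easiest: run the whole script under the profiler, sorted by
# cumulative time, from the command line:
#   python -m cProfile -s cumtime file_creator.py

# Or profile one call programmatically:
import cProfile
import pstats


def concatenate_all():
    """Hypothetical placeholder for the script's concatenation routine."""


cProfile.run("concatenate_all()", "profile.out")
pstats.Stats("profile.out").sort_stats("cumulative").print_stats(10)
```

The functions at the top of that listing are where the CPU cycles go.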
Regards
Dear friend,
This is a placeholder bid. I first need to see what's happening, and then I can tell you and fix the problem. The cost to you will depend on the actual time spent.
Hi
I'm very interested in this project.
I have experience building and debugging Python applications.
I reviewed your code. It has a lot of CPU- and memory-intensive operations.
If your purpose is just to concatenate files, Python's standard csv library is enough, as shown in the sketch below.
It is also possible to solve this task using the Windows command line only (if you are on Windows),
and likewise with the shell on other OSes.
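As a sketch of the standard-library route (file names are illustrative; unlike a plain line-by-line copy, the csv module also handles quoted fields that contain embedded newlines):

```python
import csv
import glob

csv_files = sorted(glob.glob("data/*.csv"))  # illustrative pattern

with open("combined.csv", "w", newline="") as out_file:
    writer = csv.writer(out_file)
    for index, path in enumerate(csv_files):
        with open(path, newline="") as in_file:
            reader = csv.reader(in_file)
            header = next(reader, None)
            # Write the header once, from the first file.
            if index == 0 and header is not None:
                writer.writerow(header)
            # Stream the remaining rows one at a time.
            writer.writerows(reader)
```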
Relevant skills and experience
Python, Scrapy, Web Scraping
Thanks & Regards,
Heorhii
Hi, I have 4 years of experience with Python and 6 with Linux. I think your code could be optimized, but I need some example files to test it and the precise specifications of your PC.