Eclipse will throw the error below when it runs out of memory. This normally occurs in big projects when performing memory-intensive operations, such as building the workspace;
An internal error occurred during: "Building workspace". GC overhead limit exceeded
Memory is allocated via Eclipse's configuration file, which is read during Eclipse's startup. The file is named eclipse.ini and is located in the installation directory. The default values are good enough for small projects, but do not suffice for bigger ones.
Below is the file’s content for Eclipse Neon;
-startup
plugins/org.eclipse.equinox.launcher_1.3.200.v20160318-1642.jar
--launcher.library
plugins/org.eclipse.equinox.launcher.gtk.linux.x86_64_1.1.400.v20160518-1444
-product
org.eclipse.epp.package.java.product
--launcher.defaultAction
openFile
-showsplash
org.eclipse.platform
--launcher.defaultAction
openFile
--launcher.appendVmargs
-vmargs
-Dosgi.requiredJavaVersion=1.8
-XX:+UseG1GC
-XX:+UseStringDeduplication
-Dosgi.requiredJavaVersion=1.8
-Xms256m
-Xmx1024m
Fixing the problem means allocating more memory for Eclipse. This is done by increasing the values of some parameters in the configuration file. You should only be concerned with the following lines;
-Xms256m
-Xmx1024m
Each line allocates memory, in megabytes, for a different aspect of the JVM running Eclipse: -Xms sets the initial heap size, while -Xmx sets the maximum heap size.
There's no one-size-fits-all solution for this issue, as it depends on how much memory you need versus how much memory you have. You can start by doubling the amounts, as in the example below, and adjust from there.
-Xms512m
-Xmx2048m
The changes will take effect after Eclipse is restarted.
PHP will exit with the following error once your script reaches the maximum allowed execution time;
Fatal error: Maximum execution time of 30 seconds exceeded in yourscript.php
30 seconds is the default timeout, if not specifically configured.
You can avoid this error by increasing the maximum execution time for PHP. This can be set globally or from within your PHP scripts;
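As a sketch, the global timeout is controlled by the max_execution_time directive in php.ini, while an individual script can raise its own limit with ini_set() or set_time_limit(); the 300-second value below is only an example;

```ini
; php.ini — maximum execution time per script, in seconds (0 disables the limit)
max_execution_time = 300
```

From within a script, the equivalent calls are ini_set('max_execution_time', '300') or set_time_limit(300); note that each call to set_time_limit() restarts the timeout counter from zero.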
PHP scripts are only allocated a certain amount of memory that they can use, and when a script reaches the limit, PHP will produce the following error;
PHP Fatal error: Allowed memory size of xxxx bytes exhausted (tried to allocate yyyy) in yourscript.php
To fix this, you’ll need to increase the memory limit for PHP scripts using any of the following methods;
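One common method, assuming you have access to the server configuration, is raising the memory_limit directive in php.ini; the 256M figure below is only an illustrative value;

```ini
; php.ini — maximum memory a single script may use (-1 disables the limit)
memory_limit = 256M
```

A script can also raise the limit for itself with ini_set('memory_limit', '256M'), or, under Apache with mod_php, you can set php_value memory_limit 256M in an .htaccess file.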
Maximum upload file size for PHP is bound to the lower of the post_max_size and upload_max_filesize directives in your configuration. post_max_size affects the maximum file upload size because a file upload is normally an HTTP POST operation.

post_max_size
Maximum size of POST data that PHP will accept. Its value may be 0 to disable the limit. It is ignored if POST data reading is disabled through enable_post_data_reading.

upload_max_filesize
Maximum allowed size for uploaded files.
You can update these two directives in your PHP configuration file to values that fit your requirements and then restart your web server. The following example allows file uploads of up to 200MB;

upload_max_filesize = 200M
post_max_size = 250M
Alternatively, you can add the following lines to your .htaccess file, and the settings will apply to scripts within the .htaccess file's directory;

php_value upload_max_filesize 200M
php_value post_max_size 250M
Restart Apache for the changes to take effect.
The command used to list directory contents in Linux is ls. With the -R option, ls traverses the directory recursively, showing the contents of the given directory and all its subdirectories. The relative directory path is displayed before each directory's contents are listed.
The following is an example of the command in use;
$ ls -R testdir/
testdir/:
subdir1  subdir2

testdir/subdir1:
subsubdir1  subsubdir2

testdir/subdir1/subsubdir1:
file1

testdir/subdir1/subsubdir2:
file2

testdir/subdir2:
subsubdir1  subsubdir2

testdir/subdir2/subsubdir1:
file3

testdir/subdir2/subsubdir2:
file4
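If you want to try this yourself, the sample tree from the listing can be recreated with mkdir -p and touch (the directory and file names below are taken directly from the example output);

```shell
# Recreate the sample tree from the listing above.
mkdir -p testdir/subdir1/subsubdir1 testdir/subdir1/subsubdir2 \
         testdir/subdir2/subsubdir1 testdir/subdir2/subsubdir2
touch testdir/subdir1/subsubdir1/file1 testdir/subdir1/subsubdir2/file2 \
      testdir/subdir2/subsubdir1/file3 testdir/subdir2/subsubdir2/file4

# List it recursively; each relative path is printed before its contents.
ls -R testdir/
```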