Channel: Forums - Geodatabase & ArcSDE

Different results using sql and st_geometry functions

I wanted to know how many features from my parcel layer intersect the required municipality. The parcel feature class is versioned and there are several versions created in our database. Using SQL I wrote the following statement...

SELECT count(*)
FROM parcels p, municipios m
WHERE sde.st_intersects(p.shape, m.shape) = 1
AND m.municipios = 'desired_one';

This returns 7434 records. Using Select By Location in ArcMap, the application returned 7436 selected features. Both queries were run at roughly the same time. In my latest tries I also ran "exec sde.version_util.set_current_version('SDE.DEFAULT');" just in case, but the results are still the same. What can explain the difference between these approaches? Is this expected behaviour? Any ideas or suggestions on how to obtain the same results?

Thanks
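
One possible explanation, offered as a sketch rather than a confirmed diagnosis: a plain SQL query reads only the base (business) table, while ArcMap resolves the versioned view of the class, i.e. base rows minus the deletes delta table plus the adds delta table. Two uncompressed edits would account for a difference of exactly two features. The toy model below only illustrates that arithmetic; the table contents are made up:

```python
# Sketch of why a plain SQL count can differ from ArcMap's versioned view.
# In a versioned geodatabase, edits live in delta tables (the "adds" and
# "deletes" tables) until they are compressed into the base table, so
# querying the base table alone misses those edits. Row values here are
# illustrative, not real SDE schema.

def versioned_count(base_ids, adds, deletes):
    """Rows visible in a version: base rows minus deletes, plus adds."""
    visible = (set(base_ids) - set(deletes)) | set(adds)
    return len(visible)

base = range(1, 7435)          # 7434 rows in the base table
adds = {9001, 9002}            # two features added in a version, not yet compressed
deletes = set()                # nothing deleted

print(len(list(base)))                       # plain SQL against base: 7434
print(versioned_count(base, adds, deletes))  # what ArcMap sees: 7436
```

If this is the cause, querying the versioned view of the feature class instead of the base table, or compressing the geodatabase first, should bring the two counts back in line.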

incorrect results with st_intersects

Hopefully this is an easy one...

Using st_intersects I'm trying to get the records from a polygon layer whose centroids intersect another polygon layer (administrative boundaries). So far I managed to get this with the following statement...

Select p.pid,p.oldpid,m.muni,m.region
from parcels p, munici m
where sde.st_intersects(m.shape,sde_centroid(p.shape))=1
and rownum<40;

the results are not correct. It returned PINs whose centroids fall in different administrative boundaries, but all of them are listed under the same boundary. In other words...

"242-083-125-07";"242-000-007-36";"AGUADILLA";"AGUADILLA"
"197-070-419-10";"197-070-069-49";"AGUADILLA";"AGUADILLA"
"115-092-802-AV";"115-092-802-AV";"AGUADILLA";"AGUADILLA"
"045-100-185-58";"045-000-010-88";"AGUADILLA";"AGUADILLA"
"";"115-092-802-15";"AGUADILLA";"AGUADILLA"
"168-005-001-20";"168-005-001-20";"AGUADILLA";"AGUADILLA"

I tried the same in my Postgis database (not using ESRI) and the results appear to be correct. I used the following in postgis...

SELECT p.pid,p.oldpid,m.muni,m.region
FROM parcels p,munic m
WHERE st_intersects (m.geom,st_centroid(p.geom)) limit 38;

results:

"242-083-125-07";"242-000-007-36";"JAYUYA";"PONCE"
"197-070-419-10";"197-070-069-49";"AGUAS BUENAS";"CAGUAS"
"115-092-802-AV";"115-092-802-AV";"SAN JUAN";"SAN JUAN"
"045-100-185-58";"045-000-010-88";"AGUADILLA";"AGUADILLA"
"";"115-092-802-15";"SAN JUAN";"SAN JUAN"
"168-005-001-20";"168-005-001-20";"COROZAL";"BAYAMÓN"

Could there be something wrong in the statement, or is it something else? Any suggestions are appreciated...
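
For what it's worth, the intended logic — compute each parcel's centroid and test which municipality polygon contains it — can be sanity-checked outside either database with plain Python. The helpers below are local sketches, not ST_Geometry functions, and the square is made-up test data:

```python
# A plain-Python sketch of the "centroid falls inside which polygon"
# test, useful for verifying results independently of either database.

def centroid(poly):
    """Area-weighted centroid of a simple polygon (shoelace formula)."""
    a = cx = cy = 0.0
    n = len(poly)
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return (cx / (6 * a), cy / (6 * a))

def contains(poly, pt):
    """Ray-casting point-in-polygon test."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        if (y0 > y) != (y1 > y) and x < (x1 - x0) * (y - y0) / (y1 - y0) + x0:
            inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(centroid(square))              # (2.0, 2.0)
print(contains(square, (2.0, 2.0)))  # True
```

If a check like this agrees with the PostGIS output but not the ST_Geometry output, that points at the ST_Geometry statement (or the data as stored in SDE) rather than the geometries themselves.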

Linking a domain with a table

I'm wondering if there is a way to link a particular domain with a table such that the values of the domain are updated as the values of the table are updated.

For example, I have domain named “CommunityNameForParcel_Arabic” which is supposed to read from the table named “N.DBO.CommunityNameForParcel_Domain_Arabic” as shown in the screenshot below:


Attachment 30784

Is that possible?



Thank you

Best

Jamal

Geodatabase Question

I am creating a geodatabase with point feature classes in ArcMap, and I need to attach Excel data from one document to each row in the feature classes' attribute tables. Each feature class has many attribute table rows, so I am looking for a faster way than the one I figured out: I know I can add a column with the same ID as in the attribute table, join the table to the point file in ArcMap, and then import it into my geodatabase as a feature class. Please help!
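
The join-by-ID approach described above is also easy to script. Export the Excel sheet to CSV, build a lookup keyed on the shared ID, and enrich each attribute row; the field names and values below are hypothetical:

```python
# Sketch of a join-by-ID between a spreadsheet (exported to CSV) and
# attribute rows -- the same logic as an ArcMap table join. Field names
# ("SITE_ID", "MaxTemp") and the data are hypothetical.
import csv, io

excel_csv = """SITE_ID,MaxTemp,MinTemp
A1,31.5,12.0
A2,28.0,9.5
"""

# Build a lookup table keyed on the shared ID field.
lookup = {row["SITE_ID"]: row for row in csv.DictReader(io.StringIO(excel_csv))}

# "Attribute table" rows to enrich in place.
points = [{"SITE_ID": "A1"}, {"SITE_ID": "A2"}]
for pt in points:
    pt.update(lookup[pt["SITE_ID"]])

print(points[0]["MaxTemp"])  # 31.5
```

The same dict-lookup pattern works inside an ArcGIS Python script with an update cursor, which avoids creating and importing a joined copy of every feature class.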

SDE and DBMS_CRYPTO

Why is execute permission on the DBMS_CRYPTO package needed by the "sde" user while performing a geodatabase upgrade?

This privilege has been required since ArcGIS version 10.1.

SHAPE.STArea() field in SQL 2008

I recently migrated my SDE geodatabase from SQL 2005 to SQL 2008. The SHAPE.area and SHAPE.len fields automatically changed to SHAPE.STArea() and SHAPE.STLength(). I believe this is because of the new geometry data types available in SQL 2008 and later. A simple change in the field names doesn't really present a problem until I export feature classes to a file geodatabase.

When feature classes are exported, the SHAPE.STArea() and SHAPE.STLength() fields are retained, and additional Shape_Area and Shape_Length fields are created. Now there are redundant area and length fields and none can be deleted. Also, the area fields differ by as much as several square feet in some cases.

How can I avoid or work around these additional fields? I replicate the entire SDE database as a file geodatabase, so manipulating feature classes one by one isn't ideal.

Thanks,

Justin

Archive classes do not utilize spatial indexes.

Hello all,

Would anyone know why spatial indexes are not being utilized when I load an archive feature class?

I am using ArcGIS Desktop/Server 10.2.1, I have a PostgreSQL 9.2 instance hosting a 10.2.1 enterprise geodatabase.

In my database, I have some relatively large feature classes (millions of records per feature class). When I load the default view of a versioned feature class into ArcMap and zoom in to a small region on the map, the features render very fast. When I then use the Geodatabase History toolbar to add the archive class for the same feature class to the map, the same spatial extent takes a very long time to refresh.

I have logged the statements being issued to the underlying database, and it seems pretty clear to me that the entire table is being scanned every time ArcMap refreshes the layer.

Here's what the database log looks like when the default view is refreshed:

Code:

2014-01-24 14:39:42 EST LOG:  execute sde_1390592382_0_4402: SELECT table_name, time_last_modified FROM sde.sde_tables_modified
2014-01-24 14:39:42 EST LOG:  statement: DEALLOCATE sde_1390592382_0_4402
2014-01-24 14:39:42 EST LOG:  statement: DEALLOCATE sde_1390592382_0_4404
2014-01-24 14:39:42 EST LOG:  execute sde_1390592382_0_4404: DECLARE sdecur_4404_16241 BINARY CURSOR WITH HOLD FOR  select  V__32.shape,  objectid,  representation,  override  from ( {very_long_sql_statement} ) V__32  where  ((V__32.shape && $7) = 't')
2014-01-24 14:39:42 EST DETAIL:  parameters: $1 = '18029', $2 = '18047', $3 = '18029', $4 = '18047', $5 = '18029', $6 = '18047', $7 = '380000000500000008001000E610000020000000000000009BA3A78092108FECAD96921A9AEA98020000A9869C01DAEA98020000E9869C01'
2014-01-24 14:39:42 EST LOG:  statement: FETCH FORWARD 100 from sdecur_4404_16241
2014-01-24 14:39:42 EST LOG:  statement: FETCH FORWARD 1000 from sdecur_4404_16241
2014-01-24 14:39:42 EST LOG:  statement: CLOSE sdecur_4404_16241
2014-01-24 14:39:42 EST LOG:  statement: DEALLOCATE sde_1390592382_0_4404

It is easy to spot the geometry being passed into the statement as parameter '$7', which I believe is the map extent being used to limit the query results to just include features that should be currently visible in the display.

Now, when the archive class is refreshed at the same map extent, the logged statements look like this:

Code:

2014-01-24 14:38:31 EST LOG:  execute sde_1390592311_0_4394: SELECT table_name, time_last_modified FROM sde.sde_tables_modified
2014-01-24 14:38:31 EST LOG:  statement: DEALLOCATE sde_1390592311_0_4394
2014-01-24 14:38:31 EST LOG:  statement: DEALLOCATE sde_1390592311_0_4396
2014-01-24 14:38:31 EST LOG:  execute sde_1390592311_0_4396: DECLARE sdecur_4396_16241 BINARY CURSOR WITH HOLD FOR  select  buildings_h.shape  from  test.SDE.BUILDINGS_H 
2014-01-24 14:38:31 EST LOG:  statement: FETCH FORWARD 100 from sdecur_4396_16241
2014-01-24 14:38:31 EST LOG:  statement: FETCH FORWARD 1000 from sdecur_4396_16241
... many more 'fetch forward' statements ...
2014-01-24 14:38:43 EST LOG:  statement: FETCH FORWARD 1000 from sdecur_4396_16241
2014-01-24 14:38:43 EST LOG:  statement: CLOSE sdecur_4396_16241
2014-01-24 14:38:43 EST LOG:  statement: DEALLOCATE sde_1390592311_0_4396
2014-01-24 14:38:43 EST LOG:  statement: COMMIT
2014-01-24 14:38:43 EST LOG:  statement: BEGIN
2014-01-24 14:38:43 EST LOG:  statement: COMMIT
2014-01-24 14:38:43 EST LOG:  statement: BEGIN

If I add up all of the 'FETCH FORWARD' statements (1,000 records each), the total is equivalent to the number of records in the table. I'm not sure if I've done something wrong, but I definitely don't want this to be happening... it takes about 10-15 seconds for just the one feature class in this case, every time the map refreshes.

I have found a workaround. I created a database view of the archive feature class (with a simple definition: 'select * from tablename_h'). When I load that view into ArcMap and answer the prompt for the ObjectID field (for which I pick gdb_archive_oid), ArcMap appears to use the spatial extent to limit features when it queries the view:

Code:

2014-01-24 15:16:19 EST LOG:  execute sde_1390594579_0_17731: SELECT lineage_name, time_last_modified FROM sde.sde_lineages_modified WHERE lineage_name = $1
2014-01-24 15:16:19 EST DETAIL:  parameters: $1 = '-1'
2014-01-24 15:16:19 EST LOG:  statement: DEALLOCATE sde_1390594579_0_17731
2014-01-24 15:16:19 EST LOG:  statement: SAVEPOINT sp_sde_1390594579_0_17730
2014-01-24 15:16:19 EST LOG:  execute sde_1390594579_0_17730: DECLARE sdecur_17730_16241 BINARY CURSOR WITH HOLD FOR  select shape from ( {very_long_sql_statement}) a where (shape && sde.ST_GeomFromWKB($1,$2)) = 't'
2014-01-24 15:16:19 EST DETAIL:  parameters: $1 = '\x0103000000010000000500000064bd625724ad5ec0d85e3b20e28e484064bd625724ad5ec0c06768ccfc8e4840309eba520cad5ec0c06768ccfc8e4840309eba520cad5ec0d85e3b20e28e484064bd625724ad5ec0d85e3b20e28e4840', $2 = '4326'
2014-01-24 15:16:19 EST LOG:  statement: FETCH FORWARD 100 from sdecur_17730_16241
2014-01-24 15:16:19 EST LOG:  statement: RELEASE SAVEPOINT sp_sde_1390594579_0_17730
2014-01-24 15:16:19 EST LOG:  statement: FETCH FORWARD 1000 from sdecur_17730_16241
2014-01-24 15:16:19 EST LOG:  statement: CLOSE sdecur_17730_16241
2014-01-24 15:16:19 EST LOG:  statement: DEALLOCATE sde_1390594579_0_17730

At this point, I have two questions. First, is this a known/expected behaviour of ArcGIS when working with archive classes, or might it be considered a bug? Second, is there any inherent problem with the workaround of using a view?

Querying multiple SDE Databases

Our office has multiple SDE databases on a SQL Server instance. Each SDE database represents a county within our state, and each county SDE database has the same feature classes and attribute table schema.

Is there an existing tool or script that would allow us to query all of our county databases at one time for a specific attribute total or value?

Example: we want to determine the total acreage of state forest. Each county database has a feature class called 'state_forest_area', and in that feature class is the attribute column 'total_acres'. What we would like the tool/script to do is access each county database, pull the total acres from each 'state_forest_area' feature class, and total them. Or, at the least, output all the 'state_forest_area' records so we could total them ourselves.
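
Absent an out-of-the-box tool, a short script that loops over the county connections and runs the same SQL against each is a common approach. The sketch below uses in-memory SQLite databases as stand-ins for the county SQL Server databases (a real version would open each connection with something like pyodbc); the table and column names are taken from the example above:

```python
# Sketch of totalling one attribute across several databases that share
# a schema. sqlite3 in-memory databases stand in for the per-county SDE
# databases on SQL Server.
import sqlite3

def make_county_db(acres):
    """Build a throwaway county database with the shared schema."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE state_forest_area (total_acres REAL)")
    con.executemany("INSERT INTO state_forest_area VALUES (?)", [(a,) for a in acres])
    return con

counties = [make_county_db([100.0, 250.5]), make_county_db([75.25])]

# Run the identical query against every county and accumulate.
grand_total = 0.0
for con in counties:
    (county_sum,) = con.execute(
        "SELECT COALESCE(SUM(total_acres), 0) FROM state_forest_area"
    ).fetchone()
    grand_total += county_sum

print(grand_total)  # 425.75
```

Because every county database shares the same schema, the per-county query never changes; only the connection string does.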

ArcSDE Geodatabase Upgrade Pre-Requisite Check (Discussion)

Hi,

After performing many geodatabase upgrades for many instances over the past few years, I find the "Upgrade Pre-Requisite Check" tool useless in terms of detecting real upgrade problems.

Many times the tool runs and states that your geodatabase is fine and ready for upgrade, and then when you perform the REAL upgrade... BOOM... actual problems show up!

This is useless when you want to prepare for a PRODUCTION upgrade, since you don't want to be stuck with complicated problems in production, where the only option you have is a restore and your scheduled upgrade has failed.

From my observation, the Pre-Requisite Check tool checks privileges, orphan objects, domains, and so on, but it doesn't go deeper in its analysis.

Please share your experiences and opinions.


Cheers

Client machine having difficulty opening MXD's on shared drive?

Got a call from a colleague that a client machine was having difficulty accessing saved MXDs on a shared network drive. The machine could open the MXDs no problem, but would show the dreaded red exclamation points for all the feature classes and tables located in the enterprise GDB. Right-clicking and looking at the data source, something weird was definitely going on:

Attachment 30881

Note the ALL CAPS... never seen that before when looking at the data source of an FC or table.

The funny thing is, when one manually resets the data source (right-click the FC/table, go to Properties, "Set Data Source"), the layer works just fine.



Here's a look at what a "working" feature class should look like when accessing from our database:

Attachment 30882

I should note that this is occurring on only one client machine in an organization of about 7 or so machines. Additionally, this organization received a CPU upgrade and a software upgrade to ArcGIS 10.1 around the September 2013 time frame... however, this particular problem has only occurred in the last week or so, on the one machine in question.



This problem really has me and my colleagues stumped... we would really like to be able to open MXDs without having to repair every single path!

Excel Data into ArcGIS

Hi all,

I am trying to import/attach Excel data (MaxTemp, MinTemp, precipitation) to a multi-point shapefile. Is there a faster way of doing it other than editing each point's attribute table entry by hand?

MO

Exporting large geodatabase tables produces corrupt files?

I am attempting to export the contents of a geodatabase table which includes four attributes/columns and ~225 million records. Unfortunately (in addition to being very slow, ~8 hours), the resulting output file is incomplete/corrupted. I am able to read records in the file up to a point, beyond which the records are empty (though the file is still structured as though the total number of records were present). I have tried exporting the table in smaller pieces, but with the same result. The location in the file at which the records end varies with each export (the maximum so far is ~64 million records, the minimum has been ~28 million). Does anyone know if:

1) There is an inherent limit in the number of records being exported (note, they are being exported to a text file, and are not being loaded into memory).

2) Are there alternate ways of exporting tables of this size?

Thanks!
Talbot
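
One defensive pattern, sketched below with SQLite standing in for the geodatabase, is to export in keyed chunks and verify the exported row count against the table count, so a truncated export fails loudly instead of being discovered after 8 hours. The column names are illustrative:

```python
# Sketch of exporting a big table in keyed chunks with a row-count
# check at the end, so silent truncation is caught immediately.
import csv, io, sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE big (objectid INTEGER PRIMARY KEY, val TEXT)")
con.executemany("INSERT INTO big VALUES (?, ?)",
                [(i, "v%d" % i) for i in range(1, 1001)])

CHUNK = 250
out = io.StringIO()          # stand-in for the on-disk text file
writer = csv.writer(out)
exported = 0

last_id = 0
while True:
    # Keyset pagination: resume from the last ObjectID seen, so each
    # chunk is a small independent query instead of one huge cursor.
    rows = con.execute(
        "SELECT objectid, val FROM big WHERE objectid > ? ORDER BY objectid LIMIT ?",
        (last_id, CHUNK),
    ).fetchall()
    if not rows:
        break
    writer.writerows(rows)
    exported += len(rows)
    last_id = rows[-1][0]

(total,) = con.execute("SELECT COUNT(*) FROM big").fetchone()
assert exported == total      # fail fast if anything was dropped
print(exported)  # 1000
```

Chunking by a key range also means a failed chunk can be retried on its own rather than restarting the whole export.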

import data from MySQL to ArcGIS Server

I have about 100GB of data in a MySQL database that contains geospatial information. I am interested in importing the data into ArcGIS Server 10.2 backed by a PostgreSQL database. Is there an easy way to do this? I may receive the data as CSV from a MySQL data dump, and then I want to import it into a feature class. One thought I had was to import the data via GeoEvent Processor, but I wanted to see if there is an easier way.

Could someone please provide some help or resources on how to perform this task?

Thank you,
Mark
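
If the data does arrive as CSV, the load itself is mostly a parse-and-batch-insert loop, sketched below with SQLite standing in for the PostgreSQL target; with real PostGIS you would build geometries on insert (or use a loader such as GDAL's ogr2ogr, which speaks both MySQL and PostgreSQL). The column names here are hypothetical:

```python
# Sketch of loading a CSV dump (as might come from a MySQL export) into
# a table in batches. sqlite3 stands in for the PostgreSQL target.
import csv, io, sqlite3

dump = """id,lon,lat,name
1,-66.10,18.46,San Juan
2,-67.14,18.43,Aguadilla
"""

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE places (id INTEGER, lon REAL, lat REAL, name TEXT)")

# Parse and type-convert each CSV record, then insert in one batch.
rows = [(int(r["id"]), float(r["lon"]), float(r["lat"]), r["name"])
        for r in csv.DictReader(io.StringIO(dump))]
con.executemany("INSERT INTO places VALUES (?, ?, ?, ?)", rows)

(count,) = con.execute("SELECT COUNT(*) FROM places").fetchone()
print(count)  # 2
```

For 100GB, batching the inserts (or using the database's bulk-copy path rather than row-by-row inserts) matters far more than the parsing code.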

Comparing/Importing replica schema changes

I have a two-way replica and added a couple of subtypes on a feature class in one of the geodatabases.

When I run the compare replica tool, it doesn't notice the subtype change.

Is this normal?

What's the proper way of keeping the schemas the same? Do I have to add the subtypes manually on each replica?

Automatically generating unique sample numbers in the field

My group (6-8 geologists) collects data in the field using handheld computers running ArcGIS for Mobile, synchronized with a file geodatabase each evening.
Each data point is given a unique ID number, and any samples associated with that point are linked to it using this ID. We currently type the ID numbers in by hand, and we make quite a few typos which cause problems for us down the road (for example, collecting two different data points and giving them the same unique ID number).
The unique ID needs to be known in the field so that samples can be labeled appropriately. The current scheme is alphanumeric, e.g. 14ET001 (the year, the geologist's initials, the sequential data point number) but it doesn't have to be that way necessarily.
Seems like others would have encountered this same issue. Is there a way to make our unique ID field auto-populate with the next number in the sequence? Is there a way to warn or prevent people from recording the same unique ID twice?
Is this something that the current version of ArcGIS/ArcGIS for Mobile can do? Any ideas for workarounds?
Thanks!
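
As a stopgap until the field can auto-populate, the ID scheme described above (two-digit year + geologist's initials + sequential number) can be generated and duplicate-checked by a small helper; in a real workflow the "used" set would be the IDs already synchronized to the geodatabase for that geologist:

```python
# Sketch of auto-generating the post's ID scheme (e.g. 14ET001) while
# refusing to hand out an ID that is already taken.

def next_sample_id(year2, initials, used):
    """Return the next free ID like '14ET001' not already in `used`."""
    seq = 1
    while True:
        candidate = "%02d%s%03d" % (year2, initials, seq)
        if candidate not in used:
            used.add(candidate)
            return candidate
        seq += 1

used = {"14ET001", "14ET002"}
print(next_sample_id(14, "ET", used))  # 14ET003
print(next_sample_id(14, "ET", used))  # 14ET004
```

The same membership check answers the second question: a hand-typed ID can be validated against the `used` set and rejected (or flagged) if it already exists, instead of silently creating a duplicate.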

Blank Screens when Adding Shapefiles or Tables to Geodatabase in ArcGIS 10.2.1

I signed up for the 60-day free trial for a class, so I am authorized under the Advanced (ArcInfo) Single Use package of the ArcGIS for Desktop option.

I can't import any shapefiles into my new geodatabase. I get a blank screen (see the attached Word file for screenshots). I've tried this in both ArcCatalog and ArcMap, and the same problem persists.

Any advice is appreciated. I've called the Esri help phone number and was referred to this website.

Thanks,
Kirk

DBMS table not found (-37) - I tried to create a view

Hello,

I have a problem trying to create a view with ArcSDE.

I have a layer from one user on the SDE instance, and a table from another instance; I created a DB link to that instance and created a synonym for the table.

When I do an sdetable -o describe on the table, I can see it from the sde user, but when I try to create the view:

sdetable -o create_view -T CVINCI -t INCIDENCIAS,GEVEXPED -c INCIDENCIAS.SHAPE,INCIDENCIAS.CODIGO,GEVEXPED.ESTADO -w "INCIDENCIAS.CODIGO=GEVEXPED.CODIGO" -s arcgis02 -u xxxxx -p xxxxxx

I received the error:
DBMS table not found (-37)

I'm using ArcSDE 10.2 and a direct connection.

Any idea?

thanks.

Appending data from a Double field to SDE changes the precision and scale in SDE

$
0
0
Hi, I just created a new feature class in SDE with short integer, text, and two double fields.

I set the double fields to precision 7, scale 2.

After I run my Python script to append the data for the first time, the two double fields in SDE have precision 38, scale 8. Why is this happening? Every other field is fine. The data in the two double fields only contains 2 decimal places, and the values are all small, ranging from 0.00 to 12.95. We use SQL Server 2008.

Thank you for any help.

CREATE DATABASE USER tool and tablespace best practice

Hi,
I am setting up a geodatabase on Oracle 11gR2.
I created the geodatabase within a tablespace named SDE_TBS (the default name for the sde user).

Now I am creating new users with the Create Database User tool, and I can specify a tablespace name.

Are there best practices for which tablespace to use?

Which tablespace should I use for each new user?
- Oracle default tablespace
- SDE_TBS (same SDE USER)
- a new tablespace (<newusername>_TBS)

Thanks a lot
Giuseppe P.

SQL service stops immediately after I start it

Hi,

I was trying to access my geodatabase and got the error "Failure to access the DBMS server". So I went and checked SQL Server and noticed that the service was not running (Attachment 30941). It started with no problem, but after I checked my database I got the same error again. I checked the properties of my services, and even though the service says it is running, it is not (Attachment 30942). This happened suddenly; it was working fine before.
I checked SQL Server Management Studio and got this error when I tried to connect to it (Attachment 30943).

In the error log file I got "access denied", even though I have administrator privileges.

Code:

Error: 17204, Severity: 16, State: 1.
2014-01-28 14:12:33.34 spid15s FCB::Open failed: Could not open file e:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Data\GIS_ADM_TERRENOS.mdf for file number 1. OS error: 5(Access is denied.).
2014-01-28 14:12:33.35 spid15s Error: 5120, Severity: 16, State: 101.
2014-01-28 14:12:33.35 spid15s Unable to open the physical file "e:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Data\GIS_ADM_TERRENOS.mdf". Operating system error 5: "5(Access is denied.)".
2014-01-28 14:12:33.40 spid6s The resource database build version is 11.00.2100. This is an informational message only. No user action is required.
2014-01-28 14:12:33.42 spid15s Error: 17207, Severity: 16, State: 1.
2014-01-28 14:12:33.42 spid15s FileMgr::StartSecondaryDataFiles: Operating system error 2(The system cannot find the file specified.) occurred while creating or opening file 'e:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Data\VECTOR01_GIS_ADM_TERRENOS.ndf'. Diagnose and correct the operating system error, and retry the operation.
2014-01-28 14:12:33.42 spid15s Error: 5120, Severity: 16, State: 5.
2014-01-28 14:12:33.42 spid15s Unable to open the physical file "e:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Data\VECTOR01_GIS_ADM_TERRENOS.ndf". Operating system error 2: "2(The system cannot find the file specified.)".
2014-01-28 14:12:33.42 spid15s Error: 17207, Severity: 16, State: 1.
2014-01-28 14:12:33.42 spid15s FileMgr::StartSecondaryDataFiles: Operating system error 2(The system cannot find the file specified.) occurred while creating or opening file 'e:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Data\DELTA01_GIS_ADM_TERRENOS.ndf'. Diagnose and correct the operating system error, and retry the operation.
2014-01-28 14:12:33.42 spid15s Error: 5120, Severity: 16, State: 5.
2014-01-28 14:12:33.42 spid15s Unable to open the physical file "e:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Data\DELTA01_GIS_ADM_TERRENOS.ndf". Operating system error 2: "2(The system cannot find the file specified.)".
2014-01-28 14:12:33.42 spid15s Error: 17207, Severity: 16, State: 1.
2014-01-28 14:12:33.42 spid15s FileMgr::StartSecondaryDataFiles: Operating system error 2(The system cannot find the file specified.) occurred while creating or opening file 'e:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Data\RASTER01_GIS_ADM_TERRENOS.ndf'. Diagnose and correct the operating system error, and retry the operation.
2014-01-28 14:12:33.42 spid15s Error: 5120, Severity: 16, State: 5.
2014-01-28 14:12:33.42 spid15s Unable to open the physical file "e:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Data\RASTER01_GIS_ADM_TERRENOS.ndf". Operating system error 2: "2(The system cannot find the file specified.)".
2014-01-28 14:12:33.42 spid15s Error: 17207, Severity: 16, State: 1.
2014-01-28 14:12:33.42 spid15s FileMgr::StartLogFiles: Operating system error 2(The system cannot find the file specified.) occurred while creating or opening file 'e:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Data\GIS_ADM_TERRENOS_log.ldf'. Diagnose and correct the operating system error, and retry the operation.
2014-01-28 14:12:33.42 spid15s File activation failure. The physical file name "e:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Data\GIS_ADM_TERRENOS_log.ldf" may be incorrect.
2014-01-28 14:12:33.43 Server The SQL Server Network Interface library successfully registered the Service Principal Name (SPN) [ MSSQLSvc/VMS-GIS02.ADM_TERRENOS.LOCAL ] for the SQL Server service.
2014-01-28 14:12:33.43 Server The SQL Server Network Interface library successfully registered the Service Principal Name (SPN) [ MSSQLSvc/VMS-GIS02.ADM_TERRENOS.LOCAL:1433 ] for the SQL Server service.
2014-01-28 14:12:33.50 spid6s Starting up database 'model'.
2014-01-28 14:12:33.56 spid6s Clearing tempdb database.
2014-01-28 14:12:33.57 spid6s Error: 5123, Severity: 16, State: 1.
2014-01-28 14:12:33.57 spid6s CREATE FILE encountered operating system error 5(Access is denied.) while attempting to open or create the physical file 'e:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Data\tempdb.mdf'.
2014-01-28 14:12:33.57 spid6s Error: 17204, Severity: 16, State: 1.
2014-01-28 14:12:33.57 spid6s FCB::Open failed: Could not open file e:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Data\tempdb.mdf for file number 1. OS error: 5(Access is denied.).
2014-01-28 14:12:33.57 spid6s Error: 5120, Severity: 16, State: 101.
2014-01-28 14:12:33.57 spid6s Unable to open the physical file "e:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Data\tempdb.mdf". Operating system error 5: "5(Access is denied.)".
2014-01-28 14:12:33.57 spid6s Error: 1802, Severity: 16, State: 4.
2014-01-28 14:12:33.57 spid6s CREATE DATABASE failed. Some file names listed could not be created. Check related errors.
2014-01-28 14:12:33.57 spid6s Could not create tempdb. You may not have enough disk space available. Free additional disk space by deleting other files on the tempdb drive and then restart SQL Server. Check for additional errors in the event log that may indicate why the tempdb files could not be initialized.
2014-01-28 14:12:33.57 spid6s SQL Trace was stopped due to server shutdown. Trace ID = '1'. This is an informational message only; no user action is required.
2014-01-28 14:12:34.59 spid6s Error: 25725, Severity: 16, State: 1.

Hope someone can help me with this...

thanks
DLL